DCASE Datalist

Introduction

Have you ever searched for the state-of-the-art performance on a dataset? If so, how did you make sure you actually found it?

DCASE Benchmarks is a centralized, community-driven repository that keeps track of state-of-the-art performance on DCASE and related topics. The repository powers a website that visualizes the results and gives access to the corresponding materials. Because DCASE Benchmarks relies on the community, it is built on GitHub so that anyone can contribute to it. The page makes this information and these resources available to everyone, including external and new members of the community.

DCASE Benchmarks is currently in a beta-testing stage; for any issues or suggestions, contact Toni, Benjamin, or Hamid.

Maintained by Toni Heittola, Benjamin Elizalde and Hamid Eghbalzadeh. Thanks to Khaled Koutini, Bongjun Kim and Eduardo Fonseca for beta-testing the website.

How to contribute

  1. Fork this repository: https://github.com/DCASE-REPO/dcase_benchmarks.
  2. Create a new YAML file under publications/ and name it using a bib-style key (e.g., publications/elizalde2022.yaml). Use the template templates/publication_template.yaml and fill in the required fields with the information from your publication (see the sketch after this list).
    1. Information about datasets, metrics, and tasks is defined in info/datasets.yaml, info/metrics.yaml, and info/tasks.yaml.
    2. If your publication uses a new dataset, metric, or task, add the corresponding information to these files.
  3. You can test the site and verify that the publication information is shown properly by running update.py and then starting a local server with start_local_server.py. The site can then be accessed at http://localhost:8000/.
  4. Create a pull request (PR) to add your edits to the main repository.
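
The structure of a publication entry is defined by templates/publication_template.yaml. As a rough illustration of what step 2 produces, here is a minimal sketch of a publication file; the field names and values below are assumptions for illustration only, not the template's actual schema, so always copy the fields from the template itself.

    # publications/elizalde2022.yaml -- illustrative sketch only; copy the real
    # field names from templates/publication_template.yaml.
    title: "Example Publication Title"
    authors:
      - "B. Elizalde"
    year: 2022
    url: "https://example.org/paper"            # hypothetical link to the paper
    results:
      - task: "acoustic-scene-classification"   # assumed to match an entry in info/tasks.yaml
        dataset: "example-dataset"              # assumed to match an entry in info/datasets.yaml
        metric: "accuracy"                      # assumed to match an entry in info/metrics.yaml
        value: 0.75

If the dataset, metric, or task referenced in the entry is new, remember to add a matching entry to the corresponding file under info/ before opening the pull request.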