WEAK MULTI-LABEL AUDIO-TAGGING WITH CLASS NOISE
@inproceedings{Prinz2019WEAKMA,
  title  = {WEAK MULTI-LABEL AUDIO-TAGGING WITH CLASS NOISE},
  author = {Katharina Prinz and Arthur Flexer},
  year   = {2019}
}
The necessity of annotated data for supervised learning often contrasts with the cost of obtaining reliable ground truth manually. Automated methods, on the other hand, simplify the annotation process and yield larger quantities of data with possibly noisy labels. Task 2 of the DCASE2019 Challenge, titled "Audio tagging with noisy labels and minimal supervision", addressed the question of whether such data can be incorporated into an audio-tagging learning process in a…
One Citation
The Impact of Label Noise on a Music Tagger
- Computer Science · ArXiv · 2020
It is shown that carefully annotated labels result in the highest figures of merit, but even large amounts of noisy labels contain enough information for successful learning.
References
Audio tagging with noisy labels and minimal supervision
- Computer Science · DCASE · 2019
This paper presents the task setup, the FSDKaggle2019 dataset prepared for this scientific evaluation, and a baseline system consisting of a convolutional neural network.
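A multi-label CNN baseline of this kind is straightforward to sketch. The snippet below is a hypothetical stand-in, not the actual baseline architecture: the layer sizes, the 64 mel bins, and the 80-class sigmoid head are assumptions.

```python
# Minimal multi-label audio-tagging CNN (a sketch, not the DCASE baseline itself).
# Assumed input: log-mel spectrograms of shape (batch, 1, mel_bins, frames).
import torch
import torch.nn as nn

class TinyTagger(nn.Module):
    def __init__(self, n_classes: int = 80):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size embedding
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.classifier(z)  # raw logits; one sigmoid per class downstream

model = TinyTagger()
logits = model(torch.randn(4, 1, 64, 128))                  # 64 mel bins, 128 frames
loss = nn.BCEWithLogitsLoss()(logits, torch.zeros(4, 80))   # multi-label targets
```

One independent sigmoid per class, rather than a softmax, is what makes this multi-label: any number of the 80 tags can be active in the same clip.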
AUDIO TAGGING WITH CONVOLUTIONAL NEURAL NETWORKS TRAINED WITH NOISY DATA Technical Report
- Computer Science · 2019
An ensemble obtained by averaging the predictions of all five networks provides the likelihood of 80 different labels being present in an input audio clip, and reaches a label-weighted label-ranking average precision (lwlrap) of 0.722.
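The two ingredients this summary mentions, averaging per-class likelihoods across models and scoring with lwlrap, can be sketched as follows. The lwlrap helper is an illustrative reimplementation of the metric's definition, not the challenge's official scoring code, and the model outputs are placeholders.

```python
import numpy as np

# Ensemble step: average the per-class likelihoods of several models.
model_probs = [np.random.rand(16, 80) for _ in range(5)]    # placeholder outputs
ensembled = np.mean(model_probs, axis=0)

def lwlrap(truth: np.ndarray, scores: np.ndarray) -> float:
    """Label-weighted label-ranking average precision (illustrative version).

    Weighting each class by its frequency is equivalent to averaging the
    precision-at-hit values over every positive (sample, class) pair.
    """
    precisions = []
    for y, s in zip(truth, scores):
        if not y.any():                          # skip samples with no labels
            continue
        order = np.argsort(-s)                   # class indices, best score first
        ranks = np.flatnonzero(y[order]) + 1     # 1-based ranks of the true labels
        precisions.extend(np.arange(1, len(ranks) + 1) / ranks)
    return float(np.mean(precisions))

truth = (np.random.rand(16, 80) > 0.9).astype(int)           # placeholder labels
print(lwlrap(truth, ensembled))
```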
On the Stratification of Multi-label Data
- Computer Science · ECML/PKDD · 2011
This paper considers two stratification methods for multi-label data and empirically compares them, along with random sampling, on a number of datasets, revealing interesting conclusions with respect to the utility of each method for particular types of multi-label datasets.
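A stratified multi-label split of the kind this reference studies can be obtained, for example, with the scikit-multilearn package; the snippet below assumes that library and uses placeholder data.

```python
# Sketch: stratified train/test split for multi-label data, assuming the
# scikit-multilearn package (pip install scikit-multilearn) is available.
import numpy as np
from skmultilearn.model_selection import iterative_train_test_split

X = np.random.rand(100, 20)                       # placeholder features
y = (np.random.rand(100, 5) > 0.7).astype(int)    # placeholder multi-label targets

# Iterative stratification tries to preserve each label's frequency in both splits,
# which plain random sampling cannot guarantee for rare label combinations.
X_train, y_train, X_test, y_test = iterative_train_test_split(X, y, test_size=0.2)
print(y_train.mean(axis=0), y_test.mean(axis=0))  # per-label frequencies should match
```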
Freesound Datasets: A Platform for the Creation of Open Audio Datasets
- Computer Science · ISMIR · 2017
Paper presented at the 18th International Society for Music Information Retrieval Conference, held in Suzhou, China, October 23–27, 2017.
mixup: Beyond Empirical Risk Minimization
- Computer Science · ICLR · 2018
This work proposes mixup, a simple learning principle that trains a neural network on convex combinations of pairs of examples and their labels, which improves the generalization of state-of-the-art neural network architectures.
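The convex-combination step that defines mixup is compact enough to sketch directly. In the snippet below, the Beta-distribution parameter alpha = 0.2 is a common choice, not a value taken from either paper.

```python
import numpy as np

def mixup_batch(x: np.ndarray, y: np.ndarray, alpha: float = 0.2):
    """Return convex combinations of a batch with a shuffled copy of itself.

    x: (batch, ...) inputs; y: (batch, n_classes) label vectors.
    alpha controls the Beta(alpha, alpha) mixing distribution (assumed value).
    """
    lam = np.random.beta(alpha, alpha)           # mixing coefficient in [0, 1]
    perm = np.random.permutation(len(x))         # random pairing of examples
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y + (1.0 - lam) * y[perm]    # soft multi-label targets
    return x_mixed, y_mixed
```

Because the label vectors are mixed with the same coefficient as the inputs, the soft targets slot directly into a multi-label BCE loss, which is why mixup pairs naturally with tagging tasks like the one above.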