Publications
Neural Percussive Synthesis Parameterised by High-Level Timbral Features
TLDR
A feedforward convolutional neural network architecture maps high-level input parameters to the corresponding waveform, allowing intuitive control of a synthesizer and enabling the user to shape sounds without extensive knowledge of signal processing.
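As a rough illustration of the kind of model the summary describes, the sketch below (PyTorch) upsamples a small vector of timbral controls into a fixed-length waveform with a feedforward stack of transposed convolutions; the layer sizes, the 7-dimensional control vector, and the 16384-sample output are assumptions for the example, not the authors' architecture.

    import torch
    import torch.nn as nn

    class TimbralDecoder(nn.Module):
        """Sketch only: timbral control vector -> fixed-length waveform."""
        def __init__(self, n_controls=7, n_samples=16384):
            super().__init__()
            self.n_samples = n_samples
            self.fc = nn.Linear(n_controls, 256 * 64)  # seed a coarse latent time axis
            self.net = nn.Sequential(                  # each layer upsamples x4: 64 -> 16384
                nn.ConvTranspose1d(256, 128, kernel_size=16, stride=4, padding=6), nn.ReLU(),
                nn.ConvTranspose1d(128, 64, kernel_size=16, stride=4, padding=6), nn.ReLU(),
                nn.ConvTranspose1d(64, 32, kernel_size=16, stride=4, padding=6), nn.ReLU(),
                nn.ConvTranspose1d(32, 1, kernel_size=16, stride=4, padding=6), nn.Tanh(),
            )

        def forward(self, controls):                   # controls: (batch, n_controls)
            x = self.fc(controls).view(-1, 256, 64)
            audio = self.net(x).squeeze(1)             # (batch, 16384) samples
            return audio[:, :self.n_samples]

    decoder = TimbralDecoder()
    controls = torch.rand(1, 7)                        # e.g. brightness, boominess, ...
    waveform = decoder(controls)                       # (1, 16384) synthesised waveform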
Data Augmentation for Instrument Classification Robust to Audio Effects
Paper presented at the 22nd International Conference on Digital Audio Effects (DAFx-19), held 2-6 September 2019 in Birmingham, United Kingdom.
Automatic transcription of vocalized percussion
TLDR
A Max for Live device that automatically transcribes vocalised percussion is created and an annotated dataset of vocalised drum kits is presented, followed by a user-specific device that selects the most relevant features for each user's vocalisations.
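The "most relevant features per user" idea could look roughly like the sketch below, which ranks candidate onset descriptors by mutual information with one user's drum labels; the feature matrix, label set, and classifier here are placeholders, not the system described in the paper.

    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.neighbors import KNeighborsClassifier

    # X_user: (n_onsets, n_features) descriptors of one user's vocalised hits
    # y_user: drum label per onset, e.g. 0=kick, 1=snare, 2=hi-hat (placeholder data)
    rng = np.random.default_rng(0)
    X_user = rng.normal(size=(60, 20))
    y_user = rng.integers(0, 3, size=60)

    selector = SelectKBest(mutual_info_classif, k=5).fit(X_user, y_user)
    X_sel = selector.transform(X_user)          # keep the 5 most informative features

    clf = KNeighborsClassifier(n_neighbors=3).fit(X_sel, y_user)
    print(clf.predict(selector.transform(X_user[:4])))  # classify new onsets the same way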
User Specific Adaptation in Automatic Transcription of Vocalised Percussion
TLDR
An easy-to-use, user-oriented system for automatically transcribing vocalised percussion sounds, called LVT (Live Vocalised Transcription), is presented.
TIV.lib: an open-source library for the tonal description of musical audio
TLDR
TIV.lib is an open-source library for the content-based tonal description of musical audio signals. It relies on the perceptually inspired Tonal Interval Vector space, which is based on the Discrete Fourier Transform and has potential for enhanced Music Information Retrieval.
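A minimal sketch of the underlying idea, assuming the usual Tonal Interval Vector formulation (the DFT of an L1-normalised 12-bin chroma vector, keeping coefficients 1 to 6); the weighting is illustrative and not necessarily what TIV.lib applies.

    import numpy as np

    def tonal_interval_vector(chroma, weights=(1, 1, 1, 1, 1, 1)):
        chroma = np.asarray(chroma, dtype=float)
        chroma = chroma / chroma.sum()           # L1-normalise the 12 pitch classes
        spectrum = np.fft.fft(chroma)            # DFT over the pitch-class profile
        tiv = spectrum[1:7] * np.array(weights)  # keep coefficients 1..6, weight them
        return tiv                               # 6 complex values = a 12-D tonal space

    # Example: chroma profile of a C major triad (C, E, G active)
    c_major = [1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0]
    print(tonal_interval_vector(c_major))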
An audio-only method for advertisement detection in broadcast television content
TLDR
This work presents an audio-only method for detecting the short silences that occur at the boundaries between programming and advertising, as well as between the advertisements themselves.
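An illustrative (not the paper's) way to flag such boundaries is to look for short runs of low-energy frames; the frame size, energy threshold, and duration limits below are assumptions for the example.

    import numpy as np

    def candidate_silences(audio, sr, frame_ms=10, energy_db=-60.0,
                           min_ms=100, max_ms=1000):
        audio = np.asarray(audio, dtype=float)
        frame = int(sr * frame_ms / 1000)
        n = len(audio) // frame
        frames = audio[:n * frame].reshape(n, frame)
        rms_db = 20 * np.log10(np.sqrt((frames ** 2).mean(axis=1)) + 1e-12)
        quiet = rms_db < energy_db                       # per-frame silence decision

        boundaries, start = [], None
        for i, q in enumerate(np.append(quiet, False)):  # sentinel closes the last run
            if q and start is None:
                start = i
            elif not q and start is not None:
                dur_ms = (i - start) * frame_ms
                if min_ms <= dur_ms <= max_ms:           # only short silences qualify
                    boundaries.append((start * frame / sr, i * frame / sr))
                start = None
        return boundaries                                # list of (start_s, end_s)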
The Freesound Loop Dataset and Annotation Tool
TLDR
The Freesound Loop Dataset (FSLD), a new large-scale dataset of music loops annotated by experts, is presented. The community is anticipated to find further uses for the data, in applications ranging from automatic loop characterisation to algorithmic composition.
Instrument Role Classification: Auto-tagging for Loop Based Music
TLDR
A new auto-tagging task, called “instrument role classification”, is introduced, and the performance of both neural-network-based and non-neural-network-based multi-label classification models is benchmarked for six instrument roles.
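A toy non-neural multi-label baseline for this kind of task might look like the sketch below; the six role names, the features, and the tag matrix are placeholders rather than the paper's data.

    import numpy as np
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.linear_model import LogisticRegression

    ROLES = ["percussion", "bass", "chords", "melody", "fx", "vocal"]  # assumed labels

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 32))                  # placeholder loop features
    Y = rng.integers(0, 2, size=(200, len(ROLES)))  # binary tag matrix, one column per role

    model = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    tags = model.predict(X[:3])                     # each loop can carry several roles
    print([[r for r, t in zip(ROLES, row) if t] for row in tags])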
Loopnet: Musical Loop Synthesis Conditioned on Intuitive Musical Parameters
TLDR
This work presents LoopNet, a feed-forward generative model for creating loops conditioned on intuitive parameters, and proposes controls that let composers map the ideas in their minds to an audio loop.
Freesound Loop Dataset
...
...