A proposal and evaluation of new timbre visualization methods for audio sample browsers

@article{Richan2020APA,
  title={A proposal and evaluation of new timbre visualization methods for audio sample browsers},
  author={Etienne Richan and Jean Rouat},
  journal={Personal and Ubiquitous Computing},
  year={2020},
  pages={1-14}
}
Searching through vast libraries of sound samples can be a daunting and time-consuming task. Modern audio sample browsers use mappings between acoustic properties and visual attributes to visually differentiate displayed items. There are few studies focused on how well these mappings help users search for a specific sample. We propose new methods for generating textural labels and positioning samples based on perceptual representations of timbre. We perform a series of studies to evaluate the… 
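
As a rough illustration of the kind of pipeline the abstract describes (mapping acoustic properties of samples to on-screen positions and visual attributes), here is a minimal sketch. It assumes librosa for feature extraction and a plain PCA projection; the paper itself uses perceptual timbre representations and its own textural-label generation, so the feature choices, the hypothetical sample_paths list, and the brightness mapping below are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch (not the paper's method): map audio samples to
# 2-D screen positions and a simple visual attribute from timbre features.
# Assumes librosa and scikit-learn are installed; sample_paths is hypothetical.
import numpy as np
import librosa
from sklearn.decomposition import PCA

def timbre_features(path, sr=22050, n_mfcc=13):
    """Summarise a sample's timbre as mean MFCCs plus mean spectral centroid."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)    # (n_mfcc, frames)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # (1, frames)
    return np.concatenate([mfcc.mean(axis=1), centroid.mean(axis=1)])

sample_paths = ["kick.wav", "snare.wav", "pad.wav"]  # hypothetical sample library
X = np.vstack([timbre_features(p) for p in sample_paths])

# Position: project timbre features to 2-D so similar-sounding samples land
# near each other on screen (PCA stands in for whatever perceptual layout
# method a browser might actually use).
positions = PCA(n_components=2).fit_transform(X)

# Visual attribute: map one acoustic property (spectral centroid) to a
# normalised "brightness" value that could drive a colour or texture label.
centroids = X[:, -1]
brightness = (centroids - centroids.min()) / (np.ptp(centroids) + 1e-9)

for path, (x, y), b in zip(sample_paths, positions, brightness):
    print(f"{path}: position=({x:.2f}, {y:.2f}), brightness={b:.2f}")
```
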
3 Citations

A General Framework for Visualization of Sound Collections in Musical Interfaces
TLDR
A general framework for devising interactive applications based on content-based visualization of sound collections; it allows a modular combination of different techniques for sound segmentation, analysis, and dimensionality reduction, and uses the reduced feature space to drive interaction.
Sketching sounds: an exploratory study on sound-shape associations
TLDR
Results imply that developing a synthesiser that exploits sound-shape associations is feasible, but a larger and more focused dataset is needed in follow-up studies.

References

Showing 1-10 of 75 references
A study comparing shape, colour and texture as visual labels in audio sample browsers
TLDR
A study to compare shape, colour and texture as visual labels in a known-item search task finds that shape and colour outperform texture.
Visualization of perceptual qualities in Textural sounds
TLDR
The use of metaphoric sensory properties shared between sounds and graphics is proposed, constructing a meaningful mapping of auditory to visual dimensions that essentially combines low-dimensional projection and iconic representation.
AudioMetro: directing search for sound designers through content-based cues
TLDR
User evaluations based on known-item search in collections of textural sounds show that a default grid layout ordered by filename unexpectedly outperforms content-based similarity layouts produced by a recent dimension reduction technique.
Aurally and visually enhanced audio search with soundtorch
TLDR
An enhanced auditory and graphical zoomable user interface that leverages the human brain's capability to single out sounds from a spatial mixture, so that sounds can be picked more quickly and/or more enjoyably than with standard file-by-file auditioning.
SoundTorch: Quick Browsing in Large Audio Collections
TLDR
User tests show that this method can leverage the human brain’s capability to single out sounds from a spatial mixture and enhance browsing in large collections of audio content.
Animating Timbre - A User Study
TLDR
A study to investigate the visualisation of acoustic timbre features using various visual features of a 3D rendered object suggests both that individual preferences change when multiple parameters are varied, and that there is no general consensus on preferred mappings in the multivariate case.
Adaptive Mapping of Sound Collections for Data-driven Musical Interfaces
TLDR
A novel framework for automatic creation of interactive sound spaces from sound collections using feature learning and dimensionality reduction is proposed and implemented as a software library using the SuperCollider language.
MusicGalaxy: A Multi-focus Zoomable Interface for Multi-facet Exploration of Music Collections
TLDR
An adaptive zoomable interface for exploration that uses a complex non-linear multi-focal zoom lens to exploit the distorted neighborhood relations introduced by the projection, and introduces the concept of facet distances representing different aspects of music similarity.
A comparative evaluation of auditory-visual mappings for sound visualisation
The significant role of visual communication in modern computer applications is indisputable. In the case of music, various attempts have been made from time to time to translate non-visual ideas…
Sound Search by Content-Based Navigation in Large Databases
We propose to apply the principle of interactive real-time corpus-based concatenative synthesis to search in effects or instrument sound databases, which becomes content-based navigation in a space…