A study comparing shape, colour and texture as visual labels in audio sample browsers

@inproceedings{Richan2019ASC,
  title={A study comparing shape, colour and texture as visual labels in audio sample browsers},
  author={Etienne Richan and Jean Rouat},
  booktitle={Proceedings of the 14th International Audio Mostly Conference: A Journey in Sound},
  year={2019}
}
  • Etienne Richan, J. Rouat
  • Published 18 September 2019
  • Computer Science
  • Proceedings of the 14th International Audio Mostly Conference: A Journey in Sound
Searching through vast libraries of sound samples can be a daunting and time-consuming task. Modern audio sample browsers use mappings between acoustic properties and visual attributes to visually differentiate displayed items. There are few studies focused on the effect of these mappings on the time it takes to search for a specific sample. We designed a study to compare shape, colour and texture as visual labels in a known-item search task. We describe the motivation and implementation of the… 
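As a rough illustration of the kind of acoustic-to-visual mapping described above, the sketch below derives a colour and a marker size for a sample from two simple audio descriptors. The feature choices (spectral centroid mapped to hue, RMS energy mapped to size) and the use of librosa are assumptions made for illustration, not the mapping used in the paper.

# Minimal sketch (illustrative only): map simple acoustic descriptors to visual attributes.
import colorsys
import librosa
import numpy as np

def visual_label(path):
    # Load the sample and compute two common descriptors.
    y, sr = librosa.load(path, sr=None, mono=True)
    centroid = float(np.mean(librosa.feature.spectral_centroid(y=y, sr=sr)))
    rms = float(np.mean(librosa.feature.rms(y=y)))
    # Normalise the centroid against the Nyquist frequency and use it as hue.
    hue = min(centroid / (sr / 2.0), 1.0)
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.9)
    # Louder samples get larger markers (arbitrary display scaling).
    size = 10.0 + 200.0 * min(rms, 0.5)
    return {"color": (r, g, b), "size": size}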

Citations

A proposal and evaluation of new timbre visualization methods for audio sample browsers
TLDR
It is found that shape significantly improves task performance, while color and texture have little effect, and new methods for generating textural labels and positioning samples based on perceptual representations of timbre are proposed.

References

SHOWING 1-10 OF 27 REFERENCES
Universal Style Transfer via Feature Transforms
TLDR
The key ingredient of the method is a pair of feature transforms, whitening and coloring, embedded in an image reconstruction network; together they directly match the feature covariance of the content image to that of a given style image.
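A simplified NumPy sketch of a whitening-coloring transform of the kind described in this entry is given below: content features are whitened with their own covariance and then re-colored with the style covariance and mean. The shapes, regularisation term and variable names are assumptions; this is not the authors' implementation.

import numpy as np

def whiten_color(content_feat, style_feat, eps=1e-5):
    # content_feat, style_feat: flattened feature maps of shape (C, H*W).
    # Centre the content features and whiten them with the content covariance.
    mu_c = content_feat.mean(axis=1, keepdims=True)
    fc = content_feat - mu_c
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    wc, vc = np.linalg.eigh(cov_c)
    whitened = vc @ np.diag(wc ** -0.5) @ vc.T @ fc
    # Colour the whitened features with the style covariance and add the style mean.
    mu_s = style_feat.mean(axis=1, keepdims=True)
    fs = style_feat - mu_s
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    ws, vs = np.linalg.eigh(cov_s)
    colored = vs @ np.diag(ws ** 0.5) @ vs.T @ whitened
    return colored + mu_s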
AudioMetro: directing search for sound designers through content-based cues
TLDR
It is shown through user evaluations of known-item search in collections of textural sounds that a default grid layout ordered by filename unexpectedly outperforms content-based similarity layouts produced by a recent dimension reduction technique.
Aurally and visually enhanced audio search with soundtorch
TLDR
An enhanced auditory and graphical zoomable user interface that leverages the human brain's capability to single out sounds from a spatial mixture, so that sounds can be picked more quickly and more enjoyably than with standard file-by-file auditioning.
New Brodatz-Based Image Databases for Grayscale Color and Multiband Texture Analysis
TLDR
This paper uses basic image processing tools to develop a new class of textures in which texture information is the only source of discrimination, and spectral information in this new class of textures contributes only to forming the texture.
SoundTorch: Quick Browsing in Large Audio Collections
TLDR
User tests show that this method can leverage the human brain’s capability to single out sounds from a spatial mixture and enhance browsing in large collections of audio content.
Image Style Transfer Using Convolutional Neural Networks
TLDR
A Neural Algorithm of Artistic Style is introduced that can separate and recombine the image content and style of natural images, providing new insights into the deep image representations learned by Convolutional Neural Networks and demonstrating their potential for high-level image synthesis and manipulation.
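The style representation in this line of work is commonly computed from Gram matrices of convolutional feature maps; the short sketch below shows that computation and a simple style loss. The array shapes and normalisation constant are assumptions for illustration, not the paper's exact formulation.

import numpy as np

def gram_matrix(feat):
    # feat: convolutional feature map of shape (C, H, W).
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    # Channel-by-channel correlations capture texture/style statistics.
    return f @ f.T / (c * h * w)

def style_loss(feat_generated, feat_style):
    # Mean squared difference between the two Gram matrices.
    g1, g2 = gram_matrix(feat_generated), gram_matrix(feat_style)
    return float(np.mean((g1 - g2) ** 2))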
Audio information browsing with the Sonic Browser
  • E. Brazil, M. Fernstrom
  • Computer Science
    Proceedings of the International Conference on Coordinated and Multiple Views in Exploratory Visualization (CMV 2003)
  • 2003
TLDR
The Sonic Browser, an application for browsing sound collections on personal computers, is presented, and it is described how a novel user interface with multiple views can be applied to browsing audio collections.
Visual information seeking: tight coupling of dynamic query filters with starfield displays
TLDR
New principles for visual information seeking are offered to support browsing, which is distinguished from familiar query composition and information retrieval because of its emphasis on rapid filtering to reduce result sets, progressive refinement of search parameters, continuous reformulation of goals, and visual scanning to identify results.
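As a toy illustration of the dynamic-query idea summarised above (filters that immediately reduce the result set as they are adjusted), the sketch below filters a small, hypothetical sample library by duration and brightness; the data fields and thresholds are invented for the example.

from dataclasses import dataclass

@dataclass
class Sample:
    name: str
    duration: float    # seconds
    brightness: float  # e.g. normalised spectral centroid in [0, 1]

def dynamic_filter(samples, max_duration, min_brightness):
    # Called on every slider movement; the view is redrawn from the result.
    return [s for s in samples
            if s.duration <= max_duration and s.brightness >= min_brightness]

library = [
    Sample("kick_01.wav", 0.4, 0.2),
    Sample("hat_03.wav", 0.2, 0.9),
    Sample("pad_07.wav", 4.5, 0.5),
]
print(dynamic_filter(library, max_duration=1.0, min_brightness=0.5))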
Design and evaluation of a visualization interface for querying large unstructured sound databases
TLDR
The aim of this thesis is the design of a visualization interface that lets users graphically define queries for the Freesound Project database and retrieve suitable results for a musical context.
Perceptual scaling of synthesized musical timbres: Common dimensions, specificities, and latent subject classes
TLDR
The model with latent classes and specificities gave a better fit to the data and made the acoustic correlates of the common dimensions more interpretable, suggesting that musical timbres possess specific attributes not accounted for by these shared perceptual dimensions.
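The entry above refers to a latent-class extension of multidimensional scaling; as a much simpler illustration of recovering a low-dimensional timbre space from pairwise dissimilarities, the sketch below runs plain metric MDS (scikit-learn) on a toy dissimilarity matrix. The matrix values are invented for the example and the plain-MDS model is not the latent-class model the entry describes.

import numpy as np
from sklearn.manifold import MDS

# Toy pairwise dissimilarity matrix for four hypothetical timbres
# (symmetric, zero diagonal); real studies use averaged listener ratings.
D = np.array([
    [0.0, 2.0, 5.0, 6.0],
    [2.0, 0.0, 4.0, 5.0],
    [5.0, 4.0, 0.0, 1.5],
    [6.0, 5.0, 1.5, 0.0],
])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)  # one 2-D point per timbre
print(coords)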