Form Follows Sound: Designing Interactions from Sonic Memories

@inproceedings{caramiaux2015form,
  title={Form Follows Sound: Designing Interactions from Sonic Memories},
  author={Baptiste Caramiaux and Alessandro Altavilla and Scott G. Pobiner and Atau Tanaka},
  booktitle={Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems},
  year={2015}
}
Sonic interaction is the continuous relationship between user actions and sound, mediated by some technology. Because interaction with sound may be task-oriented or experience-based, it is important to understand the nature of action–sound relationships in order to design rich sonic interactions. We propose a participatory approach to sonic interaction design that first considers the affordances of sounds in order to imagine embodied interaction and, based on this, generates interaction models…


Designing from listening: embodied experience and sonic interactions
This thesis investigates how to draw upon people’s everyday sonic experience to design interactions using body movement, digital sound processing, and embodied technologies, and provides four main contributions that help to build a framework for design addressing lesser-explored matters in sonic interaction design (SID).
Embodied Musical Interaction: Body Physiology, Cross Modality, and Sonic Experience
A. Tanaka, in New Directions in Music and Human-Computer Interaction, 2019
This chapter discusses how the concept of embodied interaction can be one way to think about music interaction and proposes how the three “paradigms” of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI.
Embodied Musical Interaction: Body Physiology, Cross Modality, and Sonic Experience
Music is a natural partner to human-computer interaction, offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument…
Interaction by ear
Sketching sonic interactions by imitation-driven sound synthesis
The integration of these two software packages provides an environment in which sound designers can go from concepts, through exploration and mocking-up, to prototyping in sonic interaction design, taking advantage of all the possibilities offered by vocal and gestural imitations in every step of the process.
From Ecological Sounding Artifacts Towards Sonic Artifact Ecologies
The discipline of sonic interaction design has been focused on the interaction between a single user and an artifact. This strongly limits one of the fundamental aspects of music as a social…
Characterising Soundscape Research in Human-Computer Interaction
‘Soundscapes’ are an increasingly active topic in Human-Computer Interaction (HCI) and interaction design, from mapping acoustic environments through sound recordings to designing compositions…
Interactive Machine Learning for Embodied Interaction Design: A tool and methodology
A five-day hackathon bringing together artists, dancers, and designers is proposed to explore the design of movement interaction and to create prototypes using the new interactive machine learning tool InteractML.
Material embodiments of electroacoustic music: an experimental workshop study
A workshop in which participants produced physical mock-ups of musical interfaces directly after miming control of short electroacoustic music pieces indicated that a notable number of participants intuitively decided to engineer alternative solutions emphasizing their personal design preferences.
NIME or Mime: A Sound-First Approach to Developing an Audio-Visual Gestural Instrument
This paper outlines the development process of an audiovisual gestural instrument, the AirSticks, and elaborates on the role ‘miming’ has played in the formation of new mappings for the instrument.


Sonic Interaction Design
Sound Embodied: Explorations of Sonic Interaction Design for Everyday Objects in a Workshop Setting
We describe an emergent field of considerable relevance to the auditory display community, that of sonic interaction design for everyday artifacts. It is positioned at the intersection of auditory…
Designing Continuous Sonic Interaction
Continuous interaction and multisensory feedback present tremendous challenges to designers, who are mostly educated along the lines of visual thinking and discrete interactions…
Participatory workshops: everyday objects and sound metaphors
The Legos project aims at studying the sound–gesture relationship, and specifically how sound can affect a sensory-motor learning process. During the project, two participatory workshops related to…
Integrating Theatrical Strategies into Sonic Interaction Design
The process of creating the sound design for a short theatre scene and the process of directing and creating the final performance, which involved a high degree of improvisation, are described.
This workshop aims to introduce ICAD participants to the use of creative interaction design methods when exploring the design of sonic interactions with computational artefacts…
Using vocal sketching for designing sonic interactions
Vocal Sketching is proposed as a methodology for sound design, alleviating the challenges non-experts face when thinking and communicating about sound and sounding objects in the early stages of design.
The SonicFinder: An Interface That Uses Auditory Icons
This work discusses sound effects and source metaphors as methods of extending auditory icons beyond the limitations implied by literal mappings, and speculates on future directions for such interfaces.
Using a systematic design process to investigate narrative sound design strategies for interactive commodities
Qualitative considerations are described and the structure of a revisable, design-oriented, participatory research process is outlined, which makes it possible to explore narrative sound designs and their possible application in interactive commodities in a systematic yet explorative way.
Mapping Through Listening
An approach that brings forward the perception–action loop as a fundamental design principle for gesture–sound mapping and that makes use of machine-learning techniques for building prototypes, from digital music instruments to interactive installations.