Ocarina: Designing the iPhone's Magic Flute

Ge Wang. Computer Music Journal, 1 June 2014.
Ocarina, created in 2008 for the iPhone, is one of the first musical artifacts in the age of pervasive, app-based mobile computing. It presents a flute-like physical interaction using microphone input, multi-touch, and accelerometers—and a social dimension that allows users to listen in to each other around the world. This article chronicles Smule's Ocarina as a mobile musical experiment for the masses, examining in depth its design, aesthetics, physical interaction, and social interaction, as… 

Augmenting the iPad: the BladeAxe

The BladeAxe is an iPad-based musical instrument leveraging the concepts of “augmented mobile device” and “hybrid physical model controller” to provide the performer with an extended expressive potential compared to a standard controller.

Extending a Standard Tablet into an Ocarina Playing Gaming Platform with Projected Parallax Layers

The proposed clip-on contraption was designed and custom-made to fit a standard 10" tablet, and a 3D role-playing game using the four-hole ocarina fingering scheme as input was developed as a gamification platform for learning to play the ocarina.

Percussionist-Centred Design for Touchscreen Digital Musical Instruments

This article describes how percussive interaction informed the design, development, and deployment of a series of touchscreen digital musical instruments for ensembles. Percussion has previously been…

Microjam: an app for sharing tiny touch-screen performances

MicroJam is a mobile app for sharing tiny touch-screen performances that encourage users not only to perform music more frequently, but to engage with others in impromptu ensemble music making.

Data-Driven Analysis of Tiny Touchscreen Performance with MicroJam

The analysis shows that users tend to focus on the center and diagonals of the touchscreen area, and that they tend to swirl or swipe rather than tap, which enhances the understanding of how users perform in touchscreen apps and could be applied in future app designs for social musical interaction.

Auraglyph: Handwritten Computer Music Composition and Design

A new model for computer music design on touchscreen devices is proposed, combining pen/stylus input with multi-touch gestures; it surpasses the barrier of touchscreen-based keyboard input and preserves the primary interaction of touch and direct manipulation throughout the development of a complex musical program.

miniAudicle for iPad: Touchscreen-based Music Software Programming

A new software application for ChucK programming and performance on mobile touchscreen devices, miniAudicle for iPad, provides a textual code Editor mode optimized for touchscreen typing, a live-coding-oriented Player mode, and collaborative network performance via a Connect mode.

Searching for Gesture and Embodiment in Live Coding

A reconsideration of text’s current seat as the de facto universal medium for programming code is proposed and alternative paths for the fusion of gesture and live coding are indicated.

Exploring Social Mobile Music with Tiny Touch-Screen Performances

The app takes inspiration from features of popular social media applications (a timeline of contributions from other users, deliberately constrained creative contributions, and the concept of a reply) to emphasise frequent and casual musical performance.

Sound-Based Sensors for NIMEs

This paper examines the use of sound sensors and audio as input material for New Interfaces for Musical Expression (NIMEs), exploring the unique affordances and character of the interactions and…

Evolving The Mobile Phone Orchestra

The origins of MoPhO are traced, the motivations behind its current hardware and software design are described against the backdrop of current trends in mobile music making, key interaction concepts around new repertoire are detailed, and the article concludes with an analysis of MoPhO's development thus far.

Smule = Sonic Media: An Intersection of the Mobile, Musical, and Social

The potential for and implications of musical (or proto-musical) social interaction and collaboration using technologies embedded in currently available mobile phones are explored, along with a series of commercial iPhone applications that each introduces a unique aspect of sonic-based social community building.

Stanford Laptop Orchestra (SLOrk)

The instantiation and adventures of the Stanford Laptop Orchestra (SLOrk), an ensemble of laptops, humans, hemispherical speaker arrays, interfaces, and, more recently, mobile smart phones, are chronicled.

Do Mobile Phones Dream of Electric Orchestras?

The motivation and making of MoPhO, the first repertoire- and ensemble-based mobile phone performance group of its kind, are described, and the ensemble demonstrates that mobile phone orchestras are compelling technological and artistic platforms for electronic music composition and performance.

Net_Dérive: Conceiving and Producing a Locative Media Artwork

The rapid uptake of mobile telephony, the high-bandwidth network access afforded by 3G networks, and the increasingly powerful multimedia capabilities of modern mobile handsets have created a…

Don't forget the laptop: using native input capabilities for expressive musical control

It is argued that instruments designed using these built-in inputs offer benefits over custom standalone controllers, particularly in certain group performance settings, and a new toolkit for rapidly experimenting with these capabilities is described.

ShaMus - A Sensor-Based Integrated Mobile Phone Instrument

ShaMus is a sensor-based approach to turning mobile devices into musical instruments that allows individual and untethered gestural performance through striking, shaking and sweeping gestures.

The World Is Your Stage

This presentation explores the research the authors are doing at Stanford and at Smule: social/mobile music, the ChucK audio programming language, as well as laptop and mobile phone orchestras in an intersection of music, computer science, and the simple joy of building things together.

Audience-Participation Techniques Based on Social Mobile Computing

This paper presents techniques for enabling audience participation based primarily on smartphones, as explored by the Stanford Mobile Phone Orchestra, and evaluates these techniques while considering the future of social music interactions aided by mobile technology.

Sonic City: The Urban Environment as a Musical Interface

The project Sonic City develops a system that enables users to create electronic music in real time by walking through and interacting with the urban environment and produces music by retrieving information about context and user action and mapping it to real-time processing of urban sounds.