Usable gestures for mobile interfaces: evaluating social acceptability

@inproceedings{Williamson2010UsableGF,
  title={Usable gestures for mobile interfaces: evaluating social acceptability},
  author={Julie Rico Williamson and Stephen A. Brewster},
  booktitle={Proceedings of the SIGCHI Conference on Human Factors in Computing Systems},
  year={2010}
}
  • J. Williamson, S. Brewster
  • Published 10 April 2010
  • Computer Science
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Gesture-based mobile interfaces require users to change the way they use technology in public settings. Since mobile phones are part of our public appearance, designers must integrate gestures that users perceive as acceptable for public use. This topic has received little attention in the literature so far. The studies described in this paper begin to look at the social acceptability of a set of gestures with respect to location and audience in order to investigate possible ways of…
Gesture and voice prototyping for early evaluations of social acceptability in multimodal interfaces
Interaction techniques that require users to adopt new behaviors mean that designers must take into account social acceptability and user experience; otherwise, the techniques may be rejected by users.
Are you comfortable doing that?: acceptance studies of around-device gestures in and for public settings
TLDR
Recommendations for around-device input designers are presented and new approaches for evaluating the social acceptability of novel input methods are suggested.
Multimodal mobile interactions: usability studies in real world settings
TLDR
Users were far more comfortable gesturing on the street than on public transport, which was reflected in the number of interactions and the perceived social acceptability of the gestures in different contexts.
Mo!Games: evaluating mobile gestures in the wild
The user experience of performing gesture-based interactions in public spaces is highly dependent on context, where users must decide which gestures they will use and how they will perform them. In…
Gesture-Based Interfaces: Practical Applications of Gestures in Real World Mobile Settings
TLDR
The practical implications of creating gesture recognition using accelerometer-based sensing, given the challenges of gesturing in mobile situations, are discussed, along with body-based interactions and the scenarios where these might be used successfully.
Touching the void: gestures for auditory interfaces
TLDR
This study proposes a novel approach to the problem of gesture acceptance by studying the design space of gestures proposed by end-users for a mobile auditory interface, and delivers some initial gestures recommendations for eyes-free auditory interfaces.
An exploratory study of user-generated spatial gestures with social mobile devices
TLDR
The results show that Scan, Swing, Nod and Turn the screen down are potential spatial gestures for intuitive use of Social Devices.
User-defined gestures for connecting mobile phones, public displays, and tabletops
TLDR
The results suggest that phone gestures have the potential to be easily understood by end users and that certain device configurations and activities may be well suited for gesture control.
Towards usable and acceptable above-device interactions
TLDR
An initial gesture collection, a preliminary evaluation of these gestures and some design recommendations will help designers create better gesture interfaces and identify interesting areas for future research.
Gesture-Based User Interfaces for Public Spaces
TLDR
This paper will summarize and evaluate the particular aspects of using gesture-based interfaces in application contexts in public and semipublic spaces.

References

Showing 1–10 of 29 references
Lexical Gesture Interface
  • Zhenyao Mo, U. Neumann
  • Computer Science
    Fourth IEEE International Conference on Computer Vision Systems (ICVS'06)
  • 2006
TLDR
This paper presents a framework for automatically producing a gesture interface based on a simple interface description; it describes and recognizes gestures in a "lexical" space, in which each hand pose is decomposed into elements of a finger-pose alphabet.
Perceptions of Mobile Phone Use in Public Settings: A Cross-Cultural Comparison
This study entailed a cross-cultural comparison of perceptions of mobile phone use in select public settings, including a movie theater, restaurant, bus, grocery store, classroom, and sidewalk. A…
Tap input as an embedded interaction method for mobile devices
TLDR
A novel interaction method for interacting with mobile devices without the need to access a keypad or a display is described, using the tap to activate logically similar functionalities on the device, leading to a simple but useful interaction method.
Shake2Talk: Multimodal Messaging for Interpersonal Communication
TLDR
This paper explores the possibilities of using audio and haptics for interpersonal communication via mobile devices, and presents the background to this work, the system design and implementation and a plan for evaluation.
Designing the spectator experience
TLDR
Public interfaces are classified according to the extent to which a performer's manipulations of an interface and their resulting effects are hidden, partially revealed, fully revealed or even amplified for spectators.
“Put-that-there”: Voice and gesture at the graphics interface
  • R. Bolt
  • Computer Science
    SIGGRAPH '80
  • 1980
TLDR
The work described herein involves the user commanding simple shapes about a large-screen graphics display surface, and because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression.
Shoogle: excitatory multimodal interaction on mobile devices
Shoogle is a novel, intuitive interface for sensing data within a mobile device, such as presence and properties of text messages or remaining resources. It is based around active exploration: devices…
Computer Vision for Human–Machine Interaction: A Framework for Gesture Generation and Interpretation
TLDR
A testbed for this framework for the generation and interpretation of spontaneous gesture in the context of speech is presented, in the form of a program that generates speech, gesture, and facial expression from underlying rules specifying what speech and gesture are produced on the basis of a given communicative intent.
Wrist rotation for interaction in mobile contexts
TLDR
Results show correlations between movement time and the Index of Difficulty of the task, and similarities in targeting performance for the first three conditions; however, walking while targeting using this method was significantly more difficult.