Publications
Tactons: Structured Tactile Messages for Non-Visual Information Display
TLDR
A new form of tactile output, Tactons, is described: structured, abstract messages that can be used to communicate information non-visually, particularly where the visual display is overloaded, limited in size, or not available.
Overcoming the Lack of Screen Space on Mobile Computers
  • S. Brewster
  • Computer Science
    Personal and Ubiquitous Computing
  • 5 January 2002
TLDR
Two formal experiments that investigate the usability of sonically-enhanced buttons of different sizes show that sound can be beneficial for usability and that care must be taken to do testing in realistic environments to get a good measure of mobile device usability.
Usable gestures for mobile interfaces: evaluating social acceptability
TLDR
The studies described in this paper begin to look at the social acceptability of a set of gestures with respect to location and audience in order to investigate ways of measuring social acceptability.
A first investigation into the effectiveness of Tactons
TLDR
The results of this experiment showed that Tactons could be a successful means of communicating information in user interfaces, with an overall recognition rate of 71%, and recognition rates of 93% for rhythm and 80% for roughness.
A Dose of Reality: Overcoming Usability Challenges in VR Head-Mounted Displays
TLDR
Usability challenges facing consumers adopting Virtual Reality (VR) head-mounted displays (HMDs) are identified in a survey of 108 VR HMD users, and it is shown that augmenting VR with a view of reality significantly corrected the performance impairment of typing in VR.
Providing a Structured Method for Integrating Non-Speech Audio into Human-Computer Interfaces
TLDR
The structured method for integrating sound into interfaces was shown to be effective when applied to existing interface widgets, and experimental results showed that sound could improve usability by increasing performance, reducing time to recover from errors, and reducing workload.
Multimodal feedback for the acquisition of small targets
TLDR
Analysis of how multimodal feedback assists small-target acquisition in graphical user interfaces shows that for small, discretely located targets all feedback modes reduce targeting times, with stickiness providing substantial improvements.
Multimodal 'eyes-free' interaction techniques for wearable devices
TLDR
Two multimodal interaction techniques designed to overcome problems and allow truly mobile, 'eyes-free' device use are presented, one of which is a 3D audio radial pie menu that uses head gestures for selecting items.
Multidimensional tactons for non-visual information presentation in mobile devices
TLDR
This study investigates recognition rates for Tactons which encode a third dimension of information using spatial location and shows that the identification rate for three-parameter Tactons is just 48%, but that this can be increased to 81% by reducing the number of values of one of the parameters.
Gestural and audio metaphors as a means of control for mobile devices
TLDR
The use of gesture and non-speech audio as ways to improve the user interface of a mobile music player is discussed, showing significant usability improvements for the gesture/audio-based interface over a standard visual/pen-based display.