Corpus ID: 119297990

Situationally Induced Impairment in Navigation Support for Runners

Shreepriya Shreepriya, Danilo Gallo, Sruthi Viswanathan, Jutta Willamowski
Mobile devices are ubiquitous and support us in a myriad of situations. In this paper, we study the support that mobile devices provide for navigation. We present our findings on Situationally Induced Impairments and Disabilities (SIID) during running. We define the context of runners and the factors affecting the use of mobile devices for navigation while running. We discuss design implications and introduce early concepts to address the uncovered SIID issues. This work contributes to the…


RunAhead: Exploring Head Scanning based Navigation for Runners
RunAhead, a navigation system using head scanning to query for navigation feedback, is designed and explored in an outdoor experiment, finding that demand and error are equivalent across all four conditions.
RunAhead: Providing Head Scanning based Navigation feedback
RunAhead, a navigation system providing head-scanning-based navigation feedback according to the runner's head scanning movement and actual head direction, gives the runner simple and intuitive feedback on the path they are looking at, highlighting the one to follow.
Understanding and supporting individuals experiencing severely constraining situational impairments
The revealed implications for design indicate that to maximize the user experience in the mobile device transaction space, designers must account for the presence of these SCSI and the unique design specifications that they require.
Challenges of situational impairments during interaction with mobile devices
It is argued that successful detection of the presence of a specific situational impairment is paramount before solutions can be proposed to adapt mobile interfaces to accommodate potential situational impairments.
Shoe me the Way: A Shoe-Based Tactile Interface for Eyes-Free Urban Navigation
This work presents Shoe me the Way, a novel tactile interface for eyes-free pedestrian navigation in urban environments that can be fully integrated into users' own, regular shoes without permanent modifications and has high usability.
ONTRACK: Dynamically adapting music playback to support navigation
An initial lab-based evaluation demonstrated the approach's efficacy: users were able to complete tasks within a reasonable time, and their subjective feedback was positive. A handheld prototype was then constructed, indicating that even with a low-fidelity realisation of the concept, users can quite effectively navigate complicated routes.
It's not just the light: understanding the factors causing situational visual impairments during mobile interaction
This paper identifies a range of factors causing SVIs, discusses mobile design implications, and introduces an SVI Context Model rooted in empirical evidence that will support the development of new effective SVI solutions.
WalkType: using accelerometer data to accomodate situational impairments in mobile touch screen text entry
WalkType is introduced, an adaptive text entry system that leverages the mobile device's built-in tri-axis accelerometer to compensate for extraneous movement while walking, using the displacement and acceleration of the device, together with inference about the user's footsteps, to improve key-press classification.
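The idea behind WalkType can be illustrated with a minimal sketch. This is not the authors' model (which learns from training data); it simply assumes a recent device displacement is available and shifts the raw touch point before nearest-key classification. All names and values here are hypothetical.

```python
# Hypothetical sketch, not WalkType's actual classifier: shift a raw touch
# point by a fraction of the device's recent displacement, then classify
# against the nearest key centre.

def compensate_touch(touch_xy, displacement_xy, gain=0.5):
    """Shift a raw touch back against device motion (gain is a made-up factor)."""
    x, y = touch_xy
    dx, dy = displacement_xy
    return (x - gain * dx, y - gain * dy)

def classify_key(touch_xy, key_centers):
    """Nearest-centre key classification over a dict of key -> (x, y)."""
    return min(key_centers,
               key=lambda k: (key_centers[k][0] - touch_xy[0]) ** 2
                           + (key_centers[k][1] - touch_xy[1]) ** 2)

keys = {"a": (10, 0), "s": (30, 0), "d": (50, 0)}
raw = (42, 0)                        # touch drifted right while walking
fixed = compensate_touch(raw, (16, 0))
print(classify_key(raw, keys))       # prints d (uncorrected miss)
print(classify_key(fixed, keys))     # prints s (intended key)
```

The real system additionally infers footstep timing and trains on collected typing data; this sketch only shows the displacement-compensation intuition.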
Vibrobelt: tactile navigation support for cyclists
Vibrobelt, a belt worn around the waist that gives waypoint, distance, and endpoint information using directional tactile cues, successfully guided all participants to their destinations over an unfamiliar route and showed a lower error rate for recognizing images from the route.
GpsTunes: controlling navigation via audio feedback
We combine the functionality of a mobile Global Positioning System (GPS) with that of an MP3 player, implemented on a PocketPC, to produce a handheld system capable of guiding a user to their desired destination.
Mobile Hand Gesture Toolkit: Co-Designing Mobile Interaction Interfaces
  • S. Stigberg
  • Conference on Designing Interactive Systems
  • 2017
A method to probe such interactions for and with runners using a participatory design approach is presented; a pilot design workshop demonstrates how participants can tell their mobile interaction story, make their own mobile hand gesture interface, and enact their story using their created artifacts.
NaviRadar: a novel tactile information display for pedestrian navigation
NaviRadar: an interaction technique for mobile phones that uses a radar metaphor to communicate the correct direction to the user at crossings along a desired route, providing distinct advantages over current systems by using only tactile feedback.
No need to stop: exploring smartphone interaction paradigms while cycling
The analysis of the interaction movements and the group discussion showed that users preferred to keep their hands on the handlebars while performing subtle gestures with their fingers, which resulted in a significantly lower physical demand and significantly lower frustration compared to the other alternatives.