Efficient human-computer interaction is key to the success of navigation systems, particularly when they are used by pedestrians. The increasing computational power of recent mobile devices makes complex multimedia user interfaces for pedestrian navigation systems feasible. To provide the best-suited interface to each user, we present a user study that compares not only three map presentation modes (bird's eye, egocentric, and combined) but also takes the users' sense of direction into account as a second independent factor. In the experiment, we did not focus on a global navigation task but on the repeated subtask of locating objects on the map. An analysis of variance (ANOVA) of the task completion times revealed a significant interaction effect between presentation mode and the participants' sense of direction. Consequently, we advocate user-adaptive presentation modes for pedestrian navigation systems.
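The study's central statistical claim is an interaction effect in a two-factor design (3 presentation modes x 2 sense-of-direction groups) on task completion time. As a minimal sketch of how such an interaction F-test is computed, the following NumPy code runs a balanced two-way ANOVA on simulated data; the cell means, group sizes, and noise level are purely hypothetical and chosen only to illustrate an interaction (poor sense of direction hurting most in the egocentric mode), not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical balanced design: 3 presentation modes x 2 sense-of-direction
# (SOD) groups, n participants per cell; task completion times in seconds.
modes = ["bird's eye", "egocentric", "combined"]
sod = ["good", "poor"]
n = 10  # participants per cell (hypothetical)

# Illustrative cell means with a built-in interaction: poor-SOD users are
# slower overall, and especially so in the egocentric mode.
cell_means = {("bird's eye", "good"): 8.0, ("bird's eye", "poor"): 10.0,
              ("egocentric", "good"): 7.5, ("egocentric", "poor"): 13.0,
              ("combined", "good"): 7.8, ("combined", "poor"): 9.5}

data = {c: cell_means[c] + rng.normal(0.0, 1.5, n) for c in cell_means}

a, b = len(modes), len(sod)
grand = np.mean([x for v in data.values() for x in v])

# Marginal means for each factor level.
mean_A = {m: np.mean(np.concatenate([data[(m, s)] for s in sod])) for m in modes}
mean_B = {s: np.mean(np.concatenate([data[(m, s)] for m in modes])) for s in sod}

# Sums of squares: main effects, interaction, and within-cell error.
ss_A = n * b * sum((mean_A[m] - grand) ** 2 for m in modes)
ss_B = n * a * sum((mean_B[s] - grand) ** 2 for s in sod)
ss_AB = n * sum((np.mean(data[(m, s)]) - mean_A[m] - mean_B[s] + grand) ** 2
                for m in modes for s in sod)
ss_err = sum(((data[(m, s)] - np.mean(data[(m, s)])) ** 2).sum()
             for m in modes for s in sod)

# Degrees of freedom and the interaction F statistic.
df_A, df_B = a - 1, b - 1
df_AB, df_err = df_A * df_B, a * b * (n - 1)
F_AB = (ss_AB / df_AB) / (ss_err / df_err)
print(f"mode x SOD interaction: F({df_AB},{df_err}) = {F_AB:.2f}")
```

A significant interaction F here means the effect of presentation mode differs between sense-of-direction groups, which is exactly the pattern that motivates adapting the presentation mode to the user rather than picking a single mode for everyone.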