Oussama Metatla

Although research on non-visual access to visually represented information is steadily growing, very little work has investigated how such forms of representation could be constructed through non-visual means. We discuss in this paper our approach for providing audio access to relational diagrams using multiple perspective hierarchies, and describe the …
An approach to designing hierarchy-based auditory displays that supports non-visual interaction with relational diagrams is presented. The approach is motivated by an analysis of the functional and structural properties of relational diagrams in terms of their role as external representations. This analysis informs the design of a multiple perspective …
This paper describes an approach to support non-visual exploration of graphically represented information. We used a hierarchical structure to organize the information encoded in a relational diagram and designed two alternative audio-only interfaces for presenting the hierarchy, each employing a different level of verbosity. We report on an experimental …
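To make the idea of a hierarchy with selectable verbosity concrete, here is a minimal sketch: a relational diagram regrouped into a tree that an audio-only interface could traverse, speaking each item tersely or verbosely. The node labels, the `describe` function, and the two verbosity levels are illustrative assumptions, not the published design.

```python
# Minimal sketch: a relational diagram (nodes plus labelled connections)
# reorganised into a hierarchy for audio-only traversal, with two verbosity
# levels for the spoken description of each item. All names are assumptions.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    label: str
    children: List["Node"] = field(default_factory=list)


def build_hierarchy() -> Node:
    """Group diagram content under separate 'Nodes' and 'Connections' branches."""
    root = Node("Diagram")
    nodes = Node("Nodes", [Node("Sensor"), Node("Controller"), Node("Actuator")])
    links = Node("Connections", [Node("Sensor feeds Controller"),
                                 Node("Controller drives Actuator")])
    root.children = [nodes, links]
    return root


def describe(node: Node, verbose: bool) -> str:
    """Terse mode speaks only the label; verbose mode adds structural context."""
    if verbose:
        return f"{node.label}, {len(node.children)} items below"
    return node.label


def traverse(node: Node, verbose: bool, depth: int = 0) -> None:
    # In a real display this string would be sent to a text-to-speech engine.
    print("  " * depth + describe(node, verbose))
    for child in node.children:
        traverse(child, verbose, depth + 1)


if __name__ == "__main__":
    traverse(build_hierarchy(), verbose=True)
```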
We address the challenge of supporting collaborators who access a shared interactive space through different sets of modalities. This was achieved by designing a cross-modal tool combining a visual diagram editor with auditory and haptic views to allow simultaneous visual and non-visual interaction. The tool was deployed in various workplaces where …
We present a detailed description of the design and integration of auditory and haptic displays in a collaborative diagram editing tool to allow simultaneous visual and non-visual interaction. The tool was deployed in various workplaces where visually-impaired and sighted coworkers access and edit diagrams as part of their daily jobs. We use our initial …
We describe the design of a collaborative cross-modal tool that supports visually-impaired and sighted coworkers in accessing and editing shared diagrams in real time, together with a case study of its use in a real-world workplace environment. Our findings highlight the potential of cross-modal collaboration to improve workplace inclusion and identify initial challenges …
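The cross-modal editor described in these abstracts keeps a visual view and non-visual (auditory and haptic) views synchronised over a single shared diagram. A minimal sketch of that idea follows, using a plain observer-style wiring; the class, method, and view names are assumptions for illustration, not the tool's actual architecture.

```python
# Minimal sketch of one shared diagram model driving several synchronised
# views (visual, auditory, haptic). The observer-style design and all names
# are illustrative assumptions, not the published implementation.

from typing import Callable, List


class SharedDiagram:
    """Single source of truth; notifies every registered view of each edit."""

    def __init__(self) -> None:
        self._nodes: List[str] = []
        self._views: List[Callable[[str, str], None]] = []

    def attach(self, view: Callable[[str, str], None]) -> None:
        self._views.append(view)

    def add_node(self, label: str) -> None:
        self._nodes.append(label)
        for view in self._views:
            view("added", label)


def visual_view(event: str, label: str) -> None:
    print(f"[visual] redraw: node '{label}' {event}")


def auditory_view(event: str, label: str) -> None:
    print(f"[audio]  speak: 'node {label} {event}'")


def haptic_view(event: str, label: str) -> None:
    print(f"[haptic] pulse force feedback at position of '{label}'")


if __name__ == "__main__":
    diagram = SharedDiagram()
    for v in (visual_view, auditory_view, haptic_view):
        diagram.attach(v)
    # A single edit is presented to sighted and non-visual collaborators alike.
    diagram.add_node("Controller")
```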
The questions involved in the design of an interactive, audio-only, computer-based football game are explored. The game design process starts with basic questions such as the size of the playing area, orientation, awareness of teammates and opponents, and basic navigation. The project goes on to explore more advanced design issues not addressed by previous …
We present an approach that examines the design of auditory displays for accessing graphically represented information in terms of their roles as external representations. This approach describes how a cross-modal translation process should emphasise the semantics of the represented information rather than the structural features of the medium that presents …
Research has suggested that adding contextual information such as reference markers to data sonification can improve interaction with auditory graphs. This paper presents results of an experiment that contributes to quantifying and analysing the extent of such benefits for an integral part of interpreting graphed data: point estimation tasks. We examine …
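As a rough illustration of an auditory graph with contextual reference markers, the sketch below maps each data value to a pitch and plays fixed reference tones at known values so a listener has anchors for point estimation. The linear value-to-frequency mapping, the reference values, and the frequency range are assumptions, not the experimental design.

```python
# Minimal sketch of an auditory graph with reference markers: data values map
# linearly onto a frequency range, and fixed reference tones at known values
# provide anchors for point estimation. All parameter values are assumptions.

def value_to_freq(value: float, vmin: float = 0.0, vmax: float = 100.0,
                  fmin: float = 220.0, fmax: float = 880.0) -> float:
    """Linearly map a data value onto a frequency range (Hz)."""
    t = (value - vmin) / (vmax - vmin)
    return fmin + t * (fmax - fmin)


def sonify(series, references=(25.0, 50.0, 75.0)):
    """Yield (kind, value, frequency) events: reference markers, then data points."""
    for r in references:
        yield ("reference", r, value_to_freq(r))
    for v in series:
        yield ("data", v, value_to_freq(v))


if __name__ == "__main__":
    for kind, value, freq in sonify([12.0, 43.5, 67.0, 88.0]):
        # A real display would synthesise these events as tones; printing stands in here.
        print(f"{kind:9s} value={value:5.1f} -> {freq:6.1f} Hz")
```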