Blended Interaction Design: A Spatial Workspace Supporting HCI and Design Practice*
- Florian Geyer
When the user interface (UI) has to be specified, a picture is worth a thousand words, and the worst thing one can do is attempt to write a natural language specification for it. Nevertheless, this practice is still common, and it is therefore a difficult task to move from text-based requirements and problem-space concepts to a final UI design, and then back again. Especially when specifying interactive UIs, actors must frequently switch between high-level descriptions and detailed screens. In our research we found that advanced UI specifications therefore have to be composed of interconnected artefacts at distinct levels of abstraction. With regard to the transparency and traceability of the rationale behind the UI specification, transitions and dependencies must be visible and traversable. We introduce a model-based UI specification method that interactively integrates interdisciplinary and informal modelling languages with different fidelities of UI prototyping into an interactive design rationale. With an innovative experimental tool we assemble models and designs into an interactive UI specification. With a zoomable user interface (ZUI) approach, we can visualize the modelled artefacts and the overall UI specification space on desktop computers as well as on megapixel displays.
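To illustrate the ZUI idea mentioned above, the following is a minimal sketch of a zoomable viewport with semantic zoom, where higher magnification reveals more detailed artefacts (e.g. from high-level models down to detailed screens). All names and thresholds here are illustrative assumptions, not taken from the authors' tool.

```python
# Hypothetical ZUI viewport sketch: a 2D camera over a space of
# specification artefacts, plus a semantic-zoom rule that maps the
# current scale to a level of detail.
from dataclasses import dataclass


@dataclass
class Artefact:
    name: str
    detail_level: int  # 0 = high-level model ... 2 = detailed screen


class Viewport:
    def __init__(self):
        self.scale = 1.0            # screen pixels per world unit
        self.offset = (0.0, 0.0)    # world coordinate at the screen origin

    def screen_to_world(self, sx, sy):
        ox, oy = self.offset
        return (ox + sx / self.scale, oy + sy / self.scale)

    def zoom_about(self, sx, sy, factor):
        """Zoom so the world point under screen point (sx, sy) stays fixed."""
        wx, wy = self.screen_to_world(sx, sy)
        self.scale *= factor
        self.offset = (wx - sx / self.scale, wy - sy / self.scale)


def visible_detail(scale):
    """Semantic zoom: reveal more detailed artefacts at higher magnification.
    The thresholds are arbitrary placeholders."""
    if scale < 1.0:
        return 0   # show only high-level models
    if scale < 4.0:
        return 1   # show intermediate artefacts
    return 2       # show detailed screen designs
```

Keeping the point under the cursor fixed while zooming is what makes transitions between abstraction levels traversable rather than disorienting; a renderer would then draw only artefacts whose `detail_level` does not exceed `visible_detail(viewport.scale)`.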