Visfer: Camera-based visual data transfer for cross-device visualization

@article{Badam2019VisferCV,
  title={Visfer: Camera-based visual data transfer for cross-device visualization},
  author={Sriram Karthik Badam and Niklas Elmqvist},
  journal={Information Visualization},
  year={2019},
  volume={18},
  pages={68--93}
}
Going beyond the desktop to leverage novel devices—such as smartphones, tablets, or large displays—for visual sensemaking typically requires supporting extraneous operations for device discovery, interaction sharing, and view management. Such operations can be time-consuming and tedious and distract the user from the actual analysis. Embodied interaction models in these multi-device environments can take advantage of the natural interaction and physicality afforded by multimodal devices and… 
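As a rough illustration of the general idea behind camera-based visual data transfer (a minimal sketch, not the authors' implementation), the snippet below serializes a hypothetical visualization state and renders it as a QR code that another device's camera could scan to reconstruct the same view, sidestepping device discovery and pairing. It assumes the third-party Python package qrcode; the state fields (dataset, chart, x, y, filter) are illustrative only.

```python
# Minimal sketch, not the Visfer implementation: serialize a (hypothetical)
# visualization state and render it as a QR code that another device's
# camera can scan to recreate the view without device discovery or pairing.
# Assumes the third-party "qrcode" package (pip install "qrcode[pil]").
import json
import qrcode

# Hypothetical view state for a scatterplot; field names are illustrative.
view_state = {
    "dataset": "cars.csv",
    "chart": "scatterplot",
    "x": "horsepower",
    "y": "mpg",
    "filter": {"origin": "Europe"},
}

# Encode the serialized state into a QR code image that can be embedded
# in (or shown next to) the visualization on the source device.
payload = json.dumps(view_state)
qrcode.make(payload).save("view_state_qr.png")

# On the receiving device, the scanned payload is decoded back into the
# same state, from which the view can be rebuilt locally.
restored = json.loads(payload)
print(restored["chart"], restored["x"], restored["y"])
```

In a real system the payload could just as well carry a short URL or session identifier instead of the full state when the state is too large to fit in a single code.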
PolyVis: Cross-Device Framework for Collaborative Visual Data Analysis
TLDR
This work presents a framework built on top of the SAGE2 platform for cross-device collaborative visual data exploration; it provides users with an environment for composing visualizations that delegate rendering to the target device, allowing them to extend their large-display workspace with portable devices as additional territory for exploration.
When David Meets Goliath: Combining Smartwatches with a Large Vertical Display for Visual Data Exploration
TLDR
A conceptual framework is proposed to enable analysts to explore data items, track interaction histories, and alter visualization configurations through mechanisms using both devices in combination to support visual data analysis.
Effects of screen-responsive visualization on data comprehension
TLDR
A visual interface with a coordinated multiple-view layout is developed for the large display, along with two alternative designs of the same interface for the mobile condition – a space-saving boundary-visualization layout and an overview layout – providing new guidelines for screen-responsive visualization interfaces.
One View Is Not Enough: Review of and Encouragement for Multiple and Alternative Representations in 3D and Immersive Visualisation
TLDR
The aim of this paper is to provide a set of concepts that enable developers to think critically and creatively, and to push the boundaries of what is possible with 3D and immersive visualisation.
Composites: A Tangible Interaction Paradigm for Visual Data Analysis in Design Practice
TLDR
This work introduces Composites, a tangible augmented-reality interface for constructing visualizations on large surfaces, and develops mechanisms (sticky interactions, visual hinting, etc.) that provide guiding feedback to the end user.
AniCode: authoring coded artifacts for network-free personalized animations
TLDR
A new framework for authoring and consuming time-based media is introduced that provides personalized animations that only decode in the intended context and is designed to be low cost and easy to use.
Designing for Mobile and Immersive Visual Analytics in the Field
TLDR
This paper uses a design probe coupling mobile, cloud, and immersive analytics components to guide interviews with ten experts from five domains to explore how visual analytics could support data collection and analysis needs in the field.
Network-Free and In-Context Animations through Printed Codes
TLDR
A new framework for authoring and consuming time-based media is introduced that provides personalized animations that only decode in the intended context and is designed to be low cost and easy to use.
Vistrates: A Component Model for Ubiquitous Analytics
TLDR
This paper presents a component model for data visualization that promotes modular designs of visualization tools and enhances their analytical scope, and introduces Vistrates, a literate computing platform for developing, assembling, and sharing visualization components.
Multiple Coordinated Views at Large Displays for Multiple Users: Empirical Findings on User Behavior, Movements, and Distances
TLDR
It is argued that future systems must provide well-supported interaction from a distance, and that a consistent design for both direct touch at the large display and distant interaction using mobile phones enables seamless exploration of large-scale multiple coordinated views (MCV) at wall-sized displays.
...

References

Showing 1–10 of 75 references
Tangible views for information visualization
TLDR
A number of interaction and visualization patterns for tangible views are introduced that constitute a vocabulary for performing a variety of common visualization tasks, suggesting the high potential of this novel approach to support interaction with complex visualizations.
VisPorter: facilitating information sharing for collaborative sensemaking on multiple displays
TLDR
A visual analytics system, VisPorter, developed for use in a multi-display, multi-device environment, is presented together with a user study that explores the usage and benefits of this system.
Supporting visual exploration for multiple users in large display environments
TLDR
A design space exploration of interaction techniques for supporting multiple collaborators exploring data on a shared large display indicates that users favor implicit interaction through proxemics for navigation and collaboration, but prefer using explicit mid-air gestures to perform actions that are perceived to be direct.
Embodied lenses for collaborative visual queries on tabletop displays
TLDR
Results show that embodied lenses are as efficient as purely virtual lenses, support tactile and eyes-free interaction, and integrate with many existing tabletop displays.
Hybrid-Image Visualization for Large Viewing Environments
TLDR
A first investigation into hybrid-image visualization for data analysis in large-scale viewing environments, using a perception-based blending approach to make two full-screen visualizations accessible without tracking viewers in front of a display.
Information Visualization and Proxemics: Design Opportunities and Empirical Findings
TLDR
This work implements interaction techniques that zoom and pan, query and relate, and adapt visualizations based on tracking of users' position in relation to a large high-resolution display, focusing here on the spatial relations between a single user and visualizations on a large display.
Conductor: enabling and understanding cross-device interaction
TLDR
Conductor, a prototype framework that serves as an exemplar for the construction of cross-device applications, is presented, along with a series of interaction methods by which users can easily share information, chain tasks, and manage sessions across devices.
Collaborative visualization: Definition, challenges, and research agenda
TLDR
The purpose of this article is to help pinpoint the unique focus of collaborative visualization with its specific aspects, challenges, and requirements within the intersection of general computer-supported cooperative work and visualization research, and to draw attention to important future research questions to be addressed by the community.
Visual encodings that support physical navigation on large displays
TLDR
This work analyzes whether and how the choice of visual encodings for large, high-resolution visualizations affects physical navigation, and ultimately task performance, in a spatial information visualization task.
...