Tangible interfaces for real-time 3D virtual environments

@inproceedings{Mazalek2007TangibleIF,
  title={Tangible interfaces for real-time 3D virtual environments},
  author={Ali Mazalek and Michael Nitsche},
  booktitle={Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACE '07)},
  year={2007}
}
Emergent game formats, such as machinima, that use game worlds as expressive 3D performance spaces gain new expressive power as the quality of their underlying graphics and animation systems increases. Nevertheless, they still lack intuitive control mechanisms. Set direction and acting are limited by tools that were designed to create and play video games rather than to produce expressive performance pieces. These tools do a poor job of capturing the performative expression that…

Tangible Interface for Controlling Toys-To-Life Characters Emotions

This work presents a novel interface for performative play, composed of a physical toy with an attached sensor device, that enables users to control the emotions of a virtual character.

Experiments in the use of game technology for pre-visualization

This overview paper outlines the value of real-time 3D engines for pre-visualization, and argues that animation control and camera control are the two main areas that need to be addressed.

PRISME: Toward a model linking tangible interaction and domain-specific tasks in mixed reality environments

This research is presented in two stages: a generic typology of interactors that can represent the majority of tangible and MR devices, followed by an interaction model based on MASCARET's activity-description meta-model syntax; the model is validated by simulating the standard operations performed by an airplane controller.

I'm in the game: embodied puppet interface improves avatar control

Results suggest that the embodied mapping between a player and avatar, provided by the puppet interface, leads to important performance advantages.

Engaging spect-actors with multimodal digital puppetry

It is found that the puppetry was truly multimodal, utilizing several input modalities simultaneously; the structure of sessions followed performative strategies; and the engagement of spectators was co-constructed.

Improving Interaction in HMD-Based Vehicle Simulators through Real Time Object Reconstruction

A system that generates a real-time virtual reconstruction of real-world user interface elements for use in a head-mounted-display-based driving simulator; it uses sensor fusion algorithms to combine data from depth and color cameras to produce an accurate, detailed, and fast rendering of the user's hands while they use the simulator.

Tangible User Interfaces and Metaphors for 3D Navigation

A new domain and task independent 3D navigation metaphor, Navigational Puppetry, is presented, which is intended to be a candidate for the navigational portion of a unifying 3D interaction metaphor.

Giving your self to the game: transferring a player's own movements to avatars using tangible interfaces

It is concluded that players - if equipped with the appropriate interfaces - can indeed project and decipher their own body movements in a game character.

Massively Multiplayer Online Worlds as a Platform for Augmented Reality Experiences

It is demonstrated that MMOs also provide a powerful platform for Augmented Reality (AR) applications, where the authors blend together locations in physical space with corresponding places in the virtual world.

Embodying Self in Virtual Worlds

This chapter presents the results of two sets of self-recognition experiments that investigated the connections between player and virtual avatar, and demonstrates that an embodied interface for virtual character control that was designed based on common coding principles is effective in personalising a player's avatar.

References

Showing 1–10 of 43 references

Puppet Show: Intuitive Puppet Interfaces for Expressive Character Control

Puppet Show, an interface plug-in for Epic's Unreal Tournament 2004, is presented; the authors argue that it exemplifies an interface trend toward more expressive input options that support a higher level of expression in video games.

A Tangible Interface for High-Level Direction of Multiple Animated Characters

This work presents a tangible interface for basic character manipulation on planar surfaces and focuses on interface aspects specific to 2D gross character animation such as path and timing specification.

Game on: The History and Culture of Video Games

From the Publisher: Video games have come a long way since the first ever computer game, Spacewar, was developed at MIT in 1962 using technology developed to further man's attempts at space travel.

Tangible bits: towards seamless interfaces between people, bits and atoms

Tangible Bits allows users to "grasp & manipulate" bits at the center of their attention by coupling the bits with everyday physical objects and architectural surfaces, and uses ambient media for background awareness.

Emerging frameworks for tangible user interfaces

The MCRpd interaction model for tangible interfaces is introduced, which relates the role of physical and digital representations, physical control, and underlying digital models to provide a foundation for identifying and discussing several key characteristics of tangible user interfaces.

Half-Real: Video Games between Real Rules and Fictional Worlds

A video game is half-real: we play by real rules while imagining a fictional world. We win or lose the game in the real world, but we slay a dragon (for example) only in the world of the game.

First Person: New Media As Story, Performance, And Game

The editors of First Person have gathered a remarkably diverse group of new media theorists and practitioners to consider the relationship between "story" and "game," as well as the new kinds of artistic creation that have become possible in the digital environment.

A design method for “whole-hand” human-computer interaction

This is a series of procedures that enumerates key issues and points for consideration in the development of whole-hand input, helping designers focus on task requirements, isolate problem areas, and choose appropriate whole-hand input strategies for their specified tasks.

A survey of glove-based input

This work provides a basis for understanding the field by describing key hand-tracking technologies and applications using glove-based input, and presents a cross-section of the field to date.

When the interface is a talking dinosaur: learning across media with ActiMates Barney

The theory and practice behind Barney’s performance in each mode (freestanding, with the computer, and with the television) are described, as well as how key research results shaped the interface across the different modes.