A multimodal augmented reality DJ music system

@inproceedings{Farbiz2007AMA,
  title={A multimodal augmented reality DJ music system},
  author={F. Farbiz and K. Tang and C. Manders and Chong Jyh Herng and Y. Tan and Kejian Wang and W. Ahmad},
  booktitle={2007 6th International Conference on Information, Communications \& Signal Processing},
  year={2007},
  pages={1--5}
}
One of the goals of human-computer interaction is to utilize more intuitive and natural methods for communication. This goal has become of particular importance in augmented reality (AR) applications. Developing such interaction techniques is critical since traditional interfaces (keyboard, mouse) are cumbersome and awkward in AR environments. This paper provides a description of a real-time interactive multimodal augmented reality entertainment system. The system design is based on the DJ…
Citations

Virtual piano with real-time interaction using automatic marker detection
The experimental results demonstrated a playback-sound error rate of only 0.5 percent, with every error traced to incorrect hand positions: some of the player's fingers accidentally covered unintended markers while playing the virtual piano.
LINEAR (Live-generated Interface and Notation Environment in Augmented Reality)
Recent developments in Augmented Reality (AR) technology are opening up new modes of representation and interaction with virtual objects; at the same time, increases in the processing power of portable…
Realidade Misturada: Conceitos, Ferramentas e Aplicações (Mixed Reality: Concepts, Tools, and Applications)
Mixed Reality combines virtual and real worlds into shared scenes, offering the user an intuitive mode of interaction suited to a specific application. This tutorial paper aims at…

References

Showing 1-10 of 12 references
Speech and Gesture Multimodal Control of a Whole Earth 3D Visualization Environment
This implementation shows that a multimodal interface integrating speech and hand gestures can be effective in a real environment, and it establishes some parameters for the design and use of such interfaces.
Masterpiece: physical interaction and 3D content-based search in VR applications
This work integrates Masterpiece into a new authoring tool for designers and engineers that uses 3D search capabilities to access original database content, supporting natural human-computer interaction.
Multimodal interaction with a wearable augmented reality system
It is shown how the input channels are integrated to use the modalities beneficially and how this enhances the interface's overall usability.
On the Relationships Among Speech, Gestures, and Object Manipulation in Virtual Environments: Initial Evidence
This chapter reports on a study investigating how people use gestures and spoken utterances while playing a videogame without the support of standard input devices. We deploy…
Mutual disambiguation of 3D multimodal interaction in augmented and virtual reality
An approach to 3D multimodal interaction in immersive augmented and virtual reality environments is described that accounts for the uncertain nature of the information sources and fuses symbolic and statistical information from a set of 3D gesture, spoken-language, and referential agents.
Virtual object manipulation on a table-top AR environment
This work describes an accurate vision-based tracking method for table-top AR environments and tangible user interface (TUI) techniques based on this method that allow users to manipulate virtual objects in a natural and intuitive manner.
Toward multimodal human-computer interface
Further research is needed on interpreting and fusing multiple sensing modalities in the context of HCI, and on the fundamental issues in integrating them at various levels, from the early signal level through the intermediate feature level to the late decision level.
“Put-that-there”: Voice and gesture at the graphics interface
  • R. Bolt
  • Computer Science
  • SIGGRAPH '80
  • 1980
The work described herein involves the user commanding simple shapes about a large-screen graphics display surface, and because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression.
Surviving on Planet CCRMA, Two Years Later and Still Alive
This presentation outlines the changes that have happened on the Planet over the past two years, focusing on the evolution of the Linux kernel that is part of Planet CCRMA.
Method and Apparatus for Voice Annotation and Retrieval of Multimedia Data
  • US Patent
  • 2002