Although data gloves allow for modeling of the human hand, they can reduce usability: they cover the entire hand, limit the sense of touch, and restrict hand flexibility. As modeling the whole hand has many advantages (e.g., for complex gesture detection), we aim to model the whole hand while at the same time keeping the …
Tropical deforestation for the establishment of tree cash crop plantations causes significant alterations to soil organic carbon (SOC) dynamics. Despite this recognition, the current Intergovernmental Panel on Climate Change (IPCC) tier 1 method has a SOC change factor of 1 (no SOC loss) for conversion of forests to perennial tree crops, because of scarcity …
Most current devices are passive regarding their location: they are either integrated into the environment or must be carried when used in mobile scenarios. In this paper we present a novel type of self-actuated device, which can be placed on vertical surfaces like whiteboards or walls. This enables vertical tangible interaction as well as the device …
This paper explores how microgestures can allow us to execute a secondary task, for example controlling mobile applications, without interrupting the manual primary task, for instance, driving a car. In order to design microgestures iteratively, we interviewed sports and physiotherapists while asking them to use task-related props, such as a steering …
Due to advances in technology, large displays with very high resolution have become affordable for daily work. Today it is possible to build display walls with a pixel density comparable to standard office screens. Previous work indicates that physical navigation enables a deeper engagement with the data set. In particular, the visibility of …
This paper focuses on combining front and back device interaction on grasped devices, using touch-based gestures. We designed generic interactions for discrete, continuous, and combined gesture commands that are executed without hand-eye control because the performing fingers are hidden behind a grasped device. We designed the interactions in such a way …
We present research that investigates the amount of guidance required by users for precise back-of-device interaction. We explore how pointing effectiveness is influenced by the presence or absence of visual guidance feedback. Participants were asked to select targets displayed on an iPad device, by touching and releasing them from underneath the device. …
Nowadays, mobile devices provide new possibilities for gesture interaction thanks to their wide range of embedded sensors and their physical form factor. In addition, auditory interfaces can now be more easily supported through advanced mobile computing capabilities. Although different types of gesture techniques have been proposed for handheld …
Graphical user interfaces for mobile devices have several drawbacks in mobile situations. In this paper, we present Foogue, an eyes-free interface that utilizes spatial audio and gesture input. Foogue does not require visual attention and hence does not divert visual attention from the task at hand. Foogue has two modes, which are designed to fit the usage …