Publications (sorted by influence)
In the blink of an eye: combining head motion and eye blink frequency for activity recognition with Google Glass
TLDR: We demonstrate how information about eye blink frequency and head motion patterns derived from Google Glass sensors can be used to distinguish different types of high-level activities.
  • 111 citations (6 highly influential)
Symbolic Object Localization Through Active Sampling of Acceleration and Sound Signatures
TLDR: We describe a novel method for symbolic location discovery of simple objects that uses vibration and short, narrow-frequency 'beeps' to sample the response of the environment to mechanical stimuli.
  • 40 citations (6 highly influential)
Where am I: Recognizing On-body Positions of Wearable Sensors
TLDR: The paper describes a method that allows us to derive the location of an acceleration sensor placed on the user's body solely based on the sensor's signal.
  • 132 citations (5 highly influential)
Dealing with sensor displacement in motion-based onbody activity recognition systems
TLDR: We present a set of heuristics that significantly increase the robustness of motion sensor-based activity recognition with respect to sensor displacement.
  • 143 citations (5 highly influential)
Using Wearable Sensors for Real-Time Recognition Tasks in Games of Martial Arts - An Initial Experiment
TLDR: Besides their stunning graphics, modern entertainment systems feature ever-higher levels of immersive user interaction.
  • 98 citations (5 highly influential)
Using acceleration signatures from everyday activities for on-body device location
  • K. Kunze, P. Lukowicz
  • 11th IEEE International Symposium on Wearable…
  • 11 October 2007
TLDR: This paper is part of an effort to facilitate wearable activity recognition using dynamically changing sets of sensors integrated in everyday appliances such as phones, PDAs, watches, headsets, etc.
  • 44 citations (5 highly influential)
Sensor Placement Variations in Wearable Activity Recognition
TLDR: This article explores how placement variations in user-carried electronic appliances influence human action recognition and how such influence can be mitigated.
  • 59 citations (4 highly influential)
Which Way Am I Facing: Inferring Horizontal Device Orientation from an Accelerometer Signal
TLDR: We present a method to infer the orientation of a mobile device carried in a pocket from the acceleration signal acquired when the user is walking.
  • 87 citations (4 highly influential)
Facial Expression Recognition in Daily Life by Embedded Photo Reflective Sensors on Smart Eyewear
TLDR: This paper presents novel smart eyewear that uses embedded photo reflective sensors and machine learning to recognize a wearer's facial expressions in daily life.
  • 44 citations (3 highly influential)
Smarter eyewear: using commercial EOG glasses for activity recognition
TLDR: We present a first evaluation of the soon commercially available Electrooculography (EOG) glasses (J!NS MEME) for use in activity recognition.
  • 42 citations (3 highly influential)