Recognizing Detailed Human Context In-the-Wild from Smartphones and Smartwatches

Abstract

We demonstrate the use of smartphones and smartwatches to automatically recognize a person's behavioral context in-the-wild. In our setup, subjects use their own personal phones and engage in regular behavior in their natural environments. Our system fuses complementary information from multi-modal sensors and simultaneously recognizes many contextual attributes from diverse behavioral domains, including work and leisure activities, body movement, transportation, and more. Our large-scale labeled dataset is publicly available.
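The fusion-plus-multi-label idea described above can be sketched minimally: concatenate per-sensor feature vectors (early fusion) and score each context label with an independent binary classifier. This is an illustrative sketch only; the feature dimensions, label names, and random weights below are made up and do not reproduce the paper's actual pipeline.

```python
import numpy as np

# Hypothetical per-sensor feature matrices for 4 examples
# (dimensions are illustrative, not the paper's feature set).
rng = np.random.default_rng(0)
acc = rng.normal(size=(4, 3))    # 3 accelerometer features
gyro = rng.normal(size=(4, 2))   # 2 gyroscope features
audio = rng.normal(size=(4, 5))  # 5 audio features

# Early fusion: concatenate the modalities into one feature vector.
fused = np.concatenate([acc, gyro, audio], axis=1)  # shape (4, 10)

# Multi-label output: one independent sigmoid score per context
# label (e.g. "walking", "at work", "in a car"); weights here are
# random placeholders, not trained parameters.
labels = ["walking", "at_work", "in_a_car"]
W = rng.normal(size=(fused.shape[1], len(labels)))
scores = 1.0 / (1.0 + np.exp(-fused @ W))  # per-label sigmoid
predictions = scores > 0.5                 # each label decided independently

print(fused.shape)        # (4, 10)
print(predictions.shape)  # (4, 3)
```

Because each label gets its own binary decision, an example can carry several simultaneous context labels (e.g. "walking" and "at work"), which matches the multi-attribute framing of the abstract.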

Cite this paper

@article{Vaizman2016RecognizingDH,
  title   = {Recognizing Detailed Human Context In-the-Wild from Smartphones and Smartwatches},
  author  = {Yonatan Vaizman and Katherine Ellis and Gert R. G. Lanckriet},
  journal = {CoRR},
  year    = {2016},
  volume  = {abs/1609.06354}
}