A multimodal data set for evaluating continuous authentication performance in smartphones

Abstract

Continuous authentication modalities allow a device to authenticate users transparently, without interrupting them or requiring their attention. This is especially important on smartphones, which are more likely to be lost or stolen than regular computers and carry a wealth of sensitive information. A multitude of signals can be harnessed for continuous authentication on mobile devices, such as touch input, accelerometer readings, and gyroscope readings. However, existing public datasets include only a handful of these, limiting the ability to run experiments that involve multiple modalities. To fill this gap, we performed a large-scale user study to collect a wide spectrum of signals on smartphones. Our dataset combines more modalities than existing datasets, including movement, orientation, touch, gestures, and pausality. It has been used to evaluate our new behavioral modality, named Hand Movement, Orientation, and Grasp (H-MOG). This poster reports on the data collection process and outcomes, as well as preliminary authentication results.

DOI: 10.1145/2668332.2668366

Cite this paper

@inproceedings{Yang2014AMD,
  title     = {A multimodal data set for evaluating continuous authentication performance in smartphones},
  author    = {Qing Yang and Ge Peng and David T. Nguyen and Xin Qi and Gang Zhou and Zdenka Sitova and Paolo Gasti and Kiran S. Balagani},
  booktitle = {SenSys},
  year      = {2014}
}