Inferring multimodal itineraries from rich smartphone data

Abstract

We designed a system to infer the multimodal itineraries traveled by a user from a combination of smartphone sensor data (e.g., GPS, Wi-Fi, inertial sensors), personal information, and knowledge of the transport network topology (e.g., maps, transportation timetables). The system operates on a Multimodal Transport Network that captures the set of admissible multimodal itineraries, i.e., paths of this network whose weights provide travel-time statistics (expected time and variance). The network takes public transportation schedules into account. Our Multimodal Transport Network is constructed from publicly available transport data for Paris and its surroundings, published by several transport agencies and map organizations. The system models sensor uncertainty probabilistically, and the likelihood that a given multimodal itinerary was taken by the user is captured in a Dynamic Bayesian Network. For this demonstration, we collected data from users travelling in the Paris region who were asked to record different trips via an Android application. After their data is uploaded to our system, a set of most likely itineraries is computed for each trip. For each trip, the system displays the recognized multimodal itineraries and their estimated likelihoods on an interactive map.
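As a rough illustration of the idea above (a sketch, not the authors' implementation), edge weights carrying an expected travel time and a variance let candidate itineraries be scored against observed segment durations with a Gaussian log-likelihood. The itineraries, edges, and durations below are all hypothetical toy values.

```python
import math

def gaussian_logpdf(x, mean, var):
    """Log-density of a normal distribution N(mean, var) at x."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def itinerary_log_likelihood(itinerary, observed_durations):
    """Score an itinerary (a list of (expected_time, variance) edge weights)
    against observed segment durations, assuming independent Gaussian
    travel times per edge."""
    return sum(gaussian_logpdf(t, mean, var)
               for (mean, var), t in zip(itinerary, observed_durations))

# Hypothetical candidate itinerary: walk to a station, then ride the metro.
# Each edge weight is (expected minutes, variance).
walk_then_metro = [(5.0, 1.0), (12.0, 4.0)]

# Segment durations derived, say, from smartphone timestamps.
observed = [4.5, 13.0]

score = itinerary_log_likelihood(walk_then_metro, observed)
```

In a full system such scores would be combined with sensor-emission probabilities inside the Dynamic Bayesian Network, rather than used in isolation as here.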

Cite this paper

@inproceedings{Montoya2015InferringMI,
  title  = {Inferring multimodal itineraries from rich smartphone data},
  author = {David Montoya and Serge Abiteboul},
  year   = {2015}
}