Quantifying Privacy Loss of Human Mobility Graph Topology

@article{Manousakas2018QuantifyingPL,
  title={Quantifying Privacy Loss of Human Mobility Graph Topology},
  author={Dionysis Manousakas and Cecilia Mascolo and Alastair R. Beresford and Dennis Chan and Nikhil Sharma},
  journal={Proceedings on Privacy Enhancing Technologies},
  year={2018},
  volume={2018},
  pages={5--21}
}
Abstract

Human mobility is often represented as a mobility network, or graph, with nodes representing places of significance which an individual visits, such as their home, workplace, and places of social amenity, and edge weights corresponding to probability estimates of movements between these places. Previous research has shown that individuals can be identified by a small number of geolocated nodes in their mobility network, rendering mobility trace anonymization a hard task. In this paper we…
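The abstract describes edge weights as probability estimates of movements between places. As an illustration only (not the paper's implementation), such a graph can be sketched as a first-order Markov estimate: count observed transitions between visited places and normalize each node's outgoing counts into probabilities. The place labels below are hypothetical.

```python
from collections import Counter, defaultdict

def mobility_graph(visits):
    """Build a weighted mobility graph from a sequence of visited places.

    Nodes are places; the weight of edge (a, b) is the empirical
    probability of moving to b given the individual is at a
    (a first-order Markov estimate from the trace).
    """
    counts = defaultdict(Counter)
    for a, b in zip(visits, visits[1:]):   # consecutive visit pairs
        counts[a][b] += 1
    graph = {}
    for a, outgoing in counts.items():
        total = sum(outgoing.values())     # normalize to probabilities
        graph[a] = {b: n / total for b, n in outgoing.items()}
    return graph

# Hypothetical visit trace for one individual
trace = ["home", "work", "cafe", "work", "home", "work"]
g = mobility_graph(trace)
# From "work" there is one observed move to "cafe" and one to "home",
# so each outgoing edge from "work" gets weight 0.5.
```

Note that the de-identification results discussed in the paper concern the topology of such graphs with the location labels removed, so even the unlabeled structure can be identifying.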

Citations

Evaluating privacy-friendly mobility analytics on aggregate location data

An end-to-end evaluation of crowdsourced privacy-friendly location aggregation aiming to understand its usefulness for analytics as well as its privacy implications towards users who contribute their data highlights that, while location aggregation is useful for mobility analytics, it is a weak privacy protection mechanism in this setting.

The Long Road to Computational Location Privacy: A Survey

Protection mechanisms for online and offline use cases are divided into six categories depending on the nature of their algorithms, and the evaluation metrics used to assess these mechanisms in terms of privacy, utility and performance are surveyed.

Time-aware multi-resolutional approach to re-identifying location histories by using social networks

Evaluation using real data demonstrated the effectiveness of the proposed method even when linking a single pseudonymized and obfuscated location history to one of 10,000 social network accounts, without any information about social relationships.

Toward privacy in IoT mobile devices for activity recognition

A privacy-preserving framework for activity recognition that relies on a machine learning technique to efficiently recognise the user's activity pattern, useful for personal healthcare monitoring, while limiting the risk of re-identifying users from the biometric patterns that characterize each individual.

Privacy-preserving IoT Framework for Activity Recognition in Personal Healthcare Monitoring

A framework is proposed that relies on machine learning to efficiently recognise user activity, useful for personal healthcare monitoring, while limiting the risk of re-identifying users from the biometric patterns characterizing each individual.

Categorizing Uses of Communications Metadata: Systematizing Knowledge and Presenting a Path for Privacy

This work provides both an intellectual framework for thinking about the privacy implications of the use of communications metadata and a roadmap, with first steps taken, for providing privacy protections for users of electronic communications.

References

Showing 1–10 of 51 references

Trajectory Recovery From Ash: User Privacy Is NOT Preserved in Aggregated Mobility Data

This work develops an attack system that exploits the uniqueness and regularity of human mobility to recover individuals' trajectories from aggregated mobility data without any prior knowledge, revealing severe privacy leakage in such datasets.

What Does The Crowd Say About You? Evaluating Aggregation-based Location Privacy

A framework allowing us to reason about privacy against an adversary attempting to predict users’ locations or recover their mobility patterns is introduced, and the privacy loss stemming from aggregate location data is quantified, with and without the protection of differential privacy.

De-anonymization Attack on Geolocated Data

This work proposes an implementation of a specific inference attack called the de-anonymization attack, by which an adversary tries to infer the identity of a particular individual behind a set of mobility traces, based on a mobility model called Mobility Markov Chain.

Unique in the Crowd: The privacy bounds of human mobility

It is found that in a dataset where the location of an individual is specified hourly, and with a spatial resolution equal to that given by the carrier's antennas, four spatio-temporal points are enough to uniquely identify 95% of the individuals.

Privacy and the City: User Identification and Location Semantics in Location-Based Social Networks

This paper simulates a scenario in which the attacker's goal is to reveal the identity of a set of LBSN users by observing their check-in activity, and shows that different types of venues display different discriminative power in terms of user identity.

De-anonymizing Social Networks

A framework for analyzing privacy and anonymity in social networks is presented and a new re-identification algorithm targeting anonymized social-network graphs is developed, showing that a third of the users who can be verified to have accounts on both Twitter and Flickr can be re-identified in the anonymous Twitter graph.

Mobile user verification/identification using statistical mobility profile

The proposed method achieves a promising identification accuracy of 96% on average on data from two randomly chosen users, making the framework feasible for inferring fraudulent use of smartphones.

Anonymization of location data does not work: a large-scale measurement study

This study shows that sharing anonymized location data will likely lead to privacy risks and that, at a minimum, the data needs to be coarse in either the time domain (meaning the data is collected over short periods of time, in which case inferring the top N locations reliably is difficult) or the space domain (meaning the data granularity is strictly higher than the cell level).

On the Privacy Implications of Location Semantics

Inference models that consider location semantics and semantic privacy-protection mechanisms are introduced and evaluated by using datasets of semantic check-ins from Foursquare, totaling more than a thousand users in six large cities.

Anonymizing Social Networks

A framework for assessing the privacy risk of sharing anonymized network data is presented and a novel anonymization technique based on perturbing the network is proposed, demonstrating empirically that it leads to substantial reduction of the privacy threat.
...