"I need a better description": An Investigation Into User Expectations For Differential Privacy

@inproceedings{cummings2021need,
  title={"I need a better description": An Investigation Into User Expectations For Differential Privacy},
  author={Rachel Cummings and Gabriel Kaptchuk and Elissa M. Redmiles},
  booktitle={Proceedings of the 2021 ACM SIGSAC Conference on Computer and Communications Security},
  year={2021}
}
Despite the recent widespread deployment of differential privacy, relatively little is known about what users think of it. In this work, we explore users' privacy expectations related to differential privacy. Specifically, we investigate (1) whether users care about the protections afforded by differential privacy, and (2) whether they are therefore more willing to share their data with differentially private systems. Further, we attempt to understand (3) users' privacy… 


"Am I Private and If So, how Many?" - Using Risk Communication Formats for Making Differential Privacy Understandable
This paper adapts risk communication formats in conjunction with a model of the privacy risks of DP, and finds that they perform similarly to the best-performing DP communications currently in use in terms of objective understanding, but do not make participants as confident in their understanding.
From Algorithmic to Institutional Logics: The Politics of Differential Privacy
This paper investigates the political dimensions of differential privacy, describing the entanglements between algorithmic privacy and institutional logics and highlighting disempowering practices that may emerge despite, or in response to, the adoption of differentially private methods.
Using Illustrations to Communicate Differential Privacy Trust Models: An Investigation of Users' Comprehension, Perception, and Data Sharing Decision
Explanatory illustrations of three DP models are designed to help laypeople conceptualize how random noise is added to protect individuals’ privacy while preserving group utility; the illustrations are found to be effective in communicating DP to participants.
Visualizing Privacy-Utility Trade-Offs in Differentially Private Data Releases
Visualizing Privacy (ViP) is presented, an interactive interface that visualizes relationships between ε, accuracy, and disclosure risk to support setting and splitting ε among queries, and has an inference setting, allowing a user to reason about the impact of DP noise on statistical inferences.
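The ε–accuracy trade-off that ViP visualizes can be made concrete with the standard Laplace mechanism: a smaller ε means a larger noise scale and therefore lower accuracy. A minimal stdlib-only sketch (function names are illustrative, not from ViP):

```python
import random


def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale): the difference of two iid Exp(1/scale) draws."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with epsilon-DP noise; noise scale = sensitivity / epsilon."""
    return true_value + laplace_noise(sensitivity / epsilon)
```

For a fixed query sensitivity, lowering ε from 10 to 0.1 multiplies the noise scale by 100, which is exactly the kind of relationship an interface like ViP lets analysts explore interactively.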
Local Differential Privacy for Belief Functions
In this paper, we propose two new definitions of local differential privacy for belief functions. One is based on Shafer’s semantics of randomly coded messages and the other from the perspective of…
SoK: Machine Learning Governance
The approach first systematizes research towards ascertaining ownership of data and models, thus fostering a notion of identity specific to ML systems, and then uses these identities to hold principals accountable for failures of ML systems through both attribution and auditing.
Differentially Private Graph Classification with GNNs
This work introduces differential privacy for graph-level classification, one of the key applications of machine learning on graphs; the method is applicable to deep learning on multi-graph datasets and relies on differentially private stochastic gradient descent (DP-SGD).
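The DP-SGD recipe the summary refers to, clipping each example's gradient and adding calibrated Gaussian noise to the average, can be sketched in a few lines. This is a generic, stdlib-only illustration with invented names, not the paper's implementation:

```python
import math
import random


def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1):
    """One DP-SGD step: clip each example's gradient to clip_norm,
    sum, add Gaussian noise, average, and take a gradient step."""
    n = len(per_example_grads)
    d = len(params)
    summed = [0.0] * d
    for g in per_example_grads:
        norm = math.sqrt(sum(x * x for x in g))
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        for i in range(d):
            summed[i] += g[i] * scale
    sigma = noise_multiplier * clip_norm
    noisy_avg = [(summed[i] + random.gauss(0.0, sigma)) / n for i in range(d)]
    return [params[i] - lr * noisy_avg[i] for i in range(d)]
```

Clipping bounds each example's influence on the update, which is what lets the added noise translate into a formal DP guarantee via composition accounting.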
Statistical Data Privacy: A Song of Privacy and Utility
The statistical foundations common to both SDC and DP are discussed, major developments in SDP are highlighted, and exciting open research problems in private inference are presented.
SoK: Differential Privacy on Graph-Structured Data
This work systematises different formulations of DP on graphs, discusses challenges and promising applications, including the GNN domain, and compares and separates prior work into graph analysis tasks and graph learning tasks with GNNs.


Towards Understanding Differential Privacy: When Do People Trust Randomized Response Technique?
It is found that allowing individuals to see the amount of obfuscation applied to their responses increases their trust in the privacy-protecting mechanism, and it is demonstrated that prudent privacy-related decisions can be cultivated with simple, usable privacy explanations.
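The randomized response technique studied in that paper is easy to make concrete. Below is a minimal sketch of Warner's classic coin-flip design, which satisfies ln(3)-local DP per response, together with the de-biasing step for the aggregate; names and parameters are illustrative, not taken from the paper:

```python
import random


def randomized_response(truth: bool) -> bool:
    """Warner's design: with probability 1/2 report the truth,
    otherwise report a fair coin flip (satisfies ln(3)-LDP)."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5


def estimate_proportion(reports) -> float:
    """De-bias the aggregate: P(report True) = 0.25 + 0.5*p, so p = 2*mean - 0.5."""
    mean = sum(reports) / len(reports)
    return 2.0 * mean - 0.5
```

Each individual's answer is plausibly deniable (a "yes" is reported with probability 0.75 by a true "yes" and 0.25 by a true "no"), yet the population proportion can still be recovered from many responses, which is the "obfuscation you can see" that the study found builds trust.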
Privacy policies as decision-making tools: an evaluation of online privacy notices
This paper evaluates the usability of online privacy policies, as well as the practice of posting them, and determines that significant changes need to be made to current practice to meet regulatory and usability requirements.
How Well Do My Results Generalize? Comparing Security and Privacy Survey Results from MTurk, Web, and Telephone Samples
These findings lend tempered support for the generalizability of prior crowdsourced security and privacy user studies; provide context to more accurately interpret the results of such studies; and suggest rich directions for future work to mitigate experience- rather than demographic-related sample biases.
"My Data Just Goes Everywhere: " User Mental Models of the Internet and Implications for Privacy and Security
A qualitative study to understand what people do and do not know about the Internet, and how that knowledge affects their responses to privacy and security risks, suggests a greater emphasis on policies and systems that protect privacy and security without relying too heavily on users' security practices.
Towards Effective Differential Privacy Communication for Users’ Data Sharing Decision and Comprehension
When shown descriptions that explain the implications instead of the definition/processes of DP or LDP technique, participants demonstrated better comprehension and showed more willingness to share information with LDP than with DP, indicating their understanding of LDP’s stronger privacy guarantee compared with DP.
Engineering Privacy
The paper uses a three-layer model of user privacy concerns to relate them to system operations and examine their effects on user behavior, and develops guidelines for building privacy-friendly systems.
Examining Internet privacy policies within the context of user privacy values
This work examines Internet users' major expectations about website privacy, reveals a notable discrepancy between what privacy policies currently state and what users deem most significant, and offers suggestions to privacy managers and software project managers.
Differential Privacy: A Primer for a Non-Technical Audience
This primer aims to provide a foundation that can guide future decisions when analyzing and sharing statistical data about individuals, informing individuals about the privacy protection they will be afforded, and designing policies and regulations for robust privacy protection.
Home is safer than the cloud!: privacy concerns for consumer cloud storage
The results show that privacy requirements for consumer cloud storage differ from those of companies, and that cultural differences greatly influence user attitudes and beliefs, such as their willingness to store sensitive data in the cloud and their acceptance that law enforcement agencies monitor user accounts.
Differential Privacy and Social Science: An Urgent Puzzle
In the discussion around privacy risks and data protection, a large number of disciplines must band together to solve this urgent puzzle of our time, including social science, computer science, ethics, law, and statistics, as well as public and private policy.