Privacy for All: Ensuring Fair and Equitable Privacy Protections
@inproceedings{Ekstrand2018PrivacyFA,
  title     = {Privacy for All: Ensuring Fair and Equitable Privacy Protections},
  author    = {Michael D. Ekstrand and Rezvan Joshaghani and Hoda Mehrpouyan},
  booktitle = {FAT},
  year      = {2018}
}
In this position paper, we argue for applying recent research on ensuring sociotechnical systems are fair and nondiscriminatory to the privacy protections those systems may provide. Privacy literature seldom considers whether a proposed privacy scheme protects all persons uniformly, irrespective of membership in protected classes or particular risk in the face of privacy failure. Just as algorithmic decision-making systems may have discriminatory outcomes even without explicit or deliberate…
45 Citations
Decision Making with Differential Privacy under a Fairness Lens
- Computer Science, IJCAI
- 2021
It is shown that, when the decisions take as input differentially private data, the noise added to achieve privacy disproportionately impacts some groups over others.
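The effect described in this summary can be illustrated with a small sketch (hypothetical group sizes and privacy budget; the standard Laplace mechanism for counting queries is assumed):

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sample from Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_count(true_count, epsilon, rng):
    # Laplace mechanism for a counting query (sensitivity 1).
    return true_count + laplace_noise(1.0 / epsilon, rng)

rng = random.Random(0)
epsilon = 0.5
groups = {"majority": 10_000, "minority": 50}  # hypothetical group sizes

rel_err = {}
for name, n in groups.items():
    trials = 2_000
    mean_abs_err = sum(abs(noisy_count(n, epsilon, rng) - n) for _ in range(trials)) / trials
    rel_err[name] = mean_abs_err / n
    print(f"{name}: mean absolute error {mean_abs_err:.1f} ({100 * rel_err[name]:.2f}% of the true count)")
```

The noise has the same absolute magnitude for every group, so the relative error — and hence the distortion of any decision based on the counts — is far larger for the small group.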
Fair decision making using privacy-protected data
- Computer Science, FAT*
- 2020
This first-of-its-kind study into the impact of formally private mechanisms (based on differential privacy) on fair and equitable decision-making investigates novel tradeoffs on two real-world decisions made using U.S. Census data as well as a classic apportionment problem.
PrivFair: a Library for Privacy-Preserving Fairness Auditing
- Computer Science, ArXiv
- 2022
PRIVFAIR is presented, a library for privacy-preserving fairness audits of ML models that protects the confidentiality of the model under audit and the sensitive data used for the audit, and supports scenarios in which a proprietary classifier owned by a company is audited using sensitive audit data from an external investigator.
On the Compatibility of Privacy and Fairness
- Computer Science, UMAP
- 2019
This work investigates whether privacy and fairness can be simultaneously achieved by a single classifier in several different models and gives an efficient algorithm for classification that maintains utility and satisfies both privacy and approximate fairness with high probability.
Implications of Data Anonymization on the Statistical Evidence of Disparity
- Computer Science
- 2020
The paper develops a conceptual foundation and mathematical formalism demonstrating that the two data anonymization mechanisms have distinctive impacts on the identifiability of disparity, which also varies based on its statistical operationalization.
Achieving Differential Privacy and Fairness in Logistic Regression
- Computer Science, WWW
- 2019
This work develops differentially private and fair logistic regression models by combining the functional mechanism and decision boundary fairness in a joint form, and demonstrates that these approaches effectively achieve both differential privacy and fairness while preserving good utility.
Post-processing of Differentially Private Data: A Fairness Perspective
- Computer Science, ArXiv
- 2022
This paper shows that postprocessing causes disparate impacts on individuals or groups and analyzes two critical settings: the release of differentially private datasets and the use of such private datasets for downstream decisions, such as the allocation of funds informed by US Census data.
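A minimal sketch of the post-processing effect described above (hypothetical numbers; simple clamping of negative noisy counts to zero is assumed as the post-processing step):

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sample from Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

rng = random.Random(1)
epsilon = 0.2      # hypothetical privacy budget
true_count = 3     # a small subpopulation
trials = 20_000

raw = [true_count + laplace_noise(1 / epsilon, rng) for _ in range(trials)]
clamped = [max(0.0, x) for x in raw]  # post-processing: counts cannot be negative

raw_mean = sum(raw) / trials
clamped_mean = sum(clamped) / trials
print(f"raw mean {raw_mean:.2f} (unbiased), clamped mean {clamped_mean:.2f} (biased upward)")
```

The raw noisy count is unbiased, but the clamped release systematically overstates small counts while large counts are barely affected — a disparate impact on small groups introduced purely by post-processing.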
Fairly Private Through Group Tagging and Relation Impact
- Computer Science, MDAI
- 2021
A case study on gender equality in an admission system shows that the proposed architecture achieves group fairness and an optimal privacy-utility trade-off for both numerical and decision-making queries.
Trade-Offs between Fairness and Privacy in Machine Learning
- Computer Science
- 2020
An impossibility theorem is proved which shows that even in simple binary classification settings, one cannot design an accurate learning algorithm that is both ε-differentially private and fair (even approximately).
Investigating Trade-offs in Utility, Fairness and Differential Privacy in Neural Networks
- Computer Science, ArXiv
- 2021
The DPF-NN was found to achieve a better risk difference than all the other neural networks, with only marginally lower accuracy than the S-NN and DP-NN; the model is considered fair, as it achieved a risk difference below both the strict and lenient thresholds.
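For reference, the risk-difference metric used in such comparisons is simply the gap in positive-prediction rates between two groups; a minimal sketch (hypothetical predictions and group labels):

```python
def risk_difference(y_pred, group):
    """Absolute gap in positive-prediction rates between the two groups present."""
    rates = []
    for g in sorted(set(group)):
        members = [y for y, gi in zip(y_pred, group) if gi == g]
        rates.append(sum(members) / len(members))
    return abs(rates[0] - rates[1])

# Hypothetical binary predictions for applicants from two groups.
y_pred = [1, 1, 0, 1, 0, 0, 0, 1]
group  = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(risk_difference(y_pred, group))  # |0.75 - 0.25| = 0.5
```

A "strict" or "lenient" fairness threshold is then just an upper bound placed on this quantity.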
References
Showing 10 of 55 references.
Privacy and the Limits of Law
- Law
- 1980
A path-breaking analysis of the concept of privacy as a question of access to the individual and to information about him. An account of the reasons why privacy is valuable, and why it has the…
PHILOSOPHICAL THEORIES OF PRIVACY: IMPLICATIONS FOR AN ADEQUATE ONLINE PRIVACY POLICY
- Law
- 2007
It is shown how RALC can help to frame an online privacy policy that is sufficiently comprehensive in scope to address a wide range of privacy concerns that arise in connection with computers and information technology.
Privacy in Context - Technology, Policy, and the Integrity of Social Life
- Political Science
- 2009
Arguing that privacy concerns should not be limited solely to control over personal information, Helen Nissenbaum counters that information ought to be distributed and protected according to the norms governing distinct social contexts, be it the workplace, health care, schools, or relations among family and friends.
The So-Called Right to Privacy
- Law
- 2009
The constitutional right to privacy has been a conservative bugaboo ever since Justice Douglas introduced it into the United States Reports in Griswold v. Connecticut. Reference to the 'so-called'…
Privacy as contextual integrity
- Political Science
- 2004
The practices of public surveillance, which include the monitoring of individuals in public through a variety of media (e.g., video, data, online), are among the least understood and controversial…
Privacy protection, control of information, and privacy-enhancing technologies
- Computer Science, CSOC
- 2001
It is argued that even if PETs provide individuals with a means of controlling their personal information, these tools do not necessarily ensure privacy protection; it is concluded that the use of PETs can actually blur the need for privacy protection, rather than provide it.
How Much Is Enough? Choosing ε for Differential Privacy
- Computer Science, ISC
- 2011
The probability of identifying any particular individual as being in the database is considered, and the challenge of setting the proper value of ε, given the goal of protecting individuals in the database with some fixed probability, is demonstrated.
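A simplified two-hypothesis version of that analysis can be computed directly: under ε-differential privacy, output probabilities differ by at most a factor of e^ε between neighboring databases, so a Bayesian adversary with a uniform prior on an individual's presence reaches a posterior of at most e^ε/(1+e^ε). (This is the textbook bound, not the paper's exact formula.)

```python
import math

# Posterior bound on inferring an individual's presence, given a 1/2 prior:
# posterior odds <= e^eps * prior odds, hence posterior <= e^eps / (1 + e^eps).
bounds = {}
for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
    bounds[eps] = math.exp(eps) / (1 + math.exp(eps))
    print(f"epsilon = {eps:>4}: identification probability <= {bounds[eps]:.3f}")
```

Even a seemingly modest ε = 2 allows a posterior of about 0.88, which is the paper's point: ε must be chosen against a concrete disclosure-probability target rather than by convention.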
Discrimination Prevention using Privacy Preserving Techniques
- Computer Science
- 2015
This paper is trying to propose a method in which privacy preserving technique can be used to prevent discrimination and the original data can be made both privacy protected and discrimination-free.
k-Anonymity: A Model for Protecting Privacy
- Computer Science, Int. J. Uncertain. Fuzziness Knowl. Based Syst.
- 2002
The solution provided in this paper includes a formal protection model named k-anonymity and a set of accompanying policies for deployment, and examines re-identification attacks that can be realized on releases that adhere to k-anonymity unless accompanying policies are respected.
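The property itself is easy to check mechanically; a minimal sketch (hypothetical records and quasi-identifiers; the paper's accompanying deployment policies are out of scope here):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True iff every combination of quasi-identifier values appears in >= k records."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical release with generalized ZIP codes and age bands.
records = [
    {"zip": "837**", "age": "20-29", "condition": "flu"},
    {"zip": "837**", "age": "20-29", "condition": "asthma"},
    {"zip": "837**", "age": "30-39", "condition": "flu"},
]
print(is_k_anonymous(records, ["zip", "age"], 2))  # False: the 30-39 group is a singleton
```

Sanitization (generalizing or suppressing values) is then the process of transforming a release until this check passes for the chosen k.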
Discrimination- and privacy-aware patterns
- Computer Science, Data Mining and Knowledge Discovery
- 2014
It is argued that privacy and discrimination risks should be tackled together, and a methodology for doing so while publishing frequent pattern mining results is presented; pattern sanitization methods based on k-anonymity yield both privacy- and discrimination-protected patterns, while introducing reasonable (controlled) pattern distortion.