Corpus ID: 225075805

A Members First Approach to Enabling LinkedIn's Labor Market Insights at Scale

@article{Rogers2020AMF,
  title={A Members First Approach to Enabling LinkedIn's Labor Market Insights at Scale},
  author={Ryan M. Rogers and Adrian Rivera Cardoso and Koray Mancuhan and Akash Kaura and Nikhil T. Gahlawat and Neha Jain and Paul Ko and Parvez Ahammad},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.13981}
}
We describe the privatization method used in reporting labor market insights from LinkedIn's Economic Graph, including the differentially private algorithms used to protect members' privacy. The reports show the top employers, as well as the top jobs and skills, in a given country/region and industry. We hope this data will help governments and citizens track labor market trends during the COVID-19 pandemic while also protecting the privacy of our members.
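As a rough illustration of the kind of privatization the paper describes (a hedged sketch, not LinkedIn's production algorithm), one can add calibrated Laplace noise to each count and report the top items by noisy count; the function name, parameters, and example data below are assumptions for the sketch.

import numpy as np

def noisy_top_jobs(counts: dict, k: int, epsilon: float, rng=None):
    # Illustrative sketch only, not the paper's exact mechanism.
    # Assumes each member contributes to at most one count (sensitivity 1).
    rng = rng or np.random.default_rng()
    noisy = {item: c + rng.laplace(scale=1.0 / epsilon)
             for item, c in counts.items()}
    return sorted(noisy, key=noisy.get, reverse=True)[:k]

# Example: top job titles for a hypothetical country/region and industry.
job_counts = {"Software Engineer": 1200, "Nurse": 950, "Data Analyst": 640}
print(noisy_top_jobs(job_counts, k=2, epsilon=1.0))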
2 Citations

Visualizing Privacy-Utility Trade-Offs in Differentially Private Data Releases

Visualizing Privacy (ViP) is presented, an interactive interface that visualizes relationships between ε, accuracy, and disclosure risk to support setting and splitting ε among queries; it also has an inference setting, allowing a user to reason about the impact of DP noise on statistical inferences.

SoK: Differential privacies

This work lists all data privacy definitions based on differential privacy and partitions them into seven categories, depending on which aspect of the original definition is modified.

References

LinkedIn's Audience Engagements API: A Privacy Preserving Data Analytics System at Scale

A privacy system that leverages differential privacy to protect LinkedIn members' data while providing audience engagement insights for marketing-analytics applications, together with a budget management service that enforces a strict differential privacy budget on the results returned to the analyst.
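A minimal sketch of what such a budget management service might look like, assuming simple sequential composition (per-query epsilons sum toward a cap); the class and method names are illustrative, not LinkedIn's API.

class BudgetExceeded(Exception):
    pass

class PrivacyBudgetLedger:
    # Hypothetical per-analyst ledger; not LinkedIn's actual service.
    def __init__(self, total_epsilon: float):
        self.total_epsilon = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon: float) -> None:
        # Deduct epsilon for one query; refuse if the budget would overflow.
        if self.spent + epsilon > self.total_epsilon:
            raise BudgetExceeded("privacy budget exhausted for this analyst")
        self.spent += epsilon

# Usage: charge the ledger before answering each analyst query.
ledger = PrivacyBudgetLedger(total_epsilon=3.0)
ledger.charge(1.0)   # ok
ledger.charge(1.5)   # ok
# ledger.charge(1.0) # would raise BudgetExceeded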

Google COVID-19 Community Mobility Reports: Anonymization Process Description (version 1.0)

This document describes the aggregation and anonymization process applied to the initial version of Google COVID-19 Community Mobility Reports (published at this http URL on April 2, 2020).

Mechanism Design via Differential Privacy

It is shown that the recent notion of differential privacy, in addition to its own intrinsic virtue, can ensure that participants have limited effect on the outcome of the mechanism, and as a consequence have limited incentive to lie.
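This is also the paper that introduced the exponential mechanism; a minimal sketch, where the score dictionary and the sensitivity bound delta_u on scores are assumed inputs:

import numpy as np

def exponential_mechanism(scores: dict, epsilon: float, delta_u: float,
                          rng=None):
    # Select an outcome with probability proportional to
    # exp(epsilon * score / (2 * delta_u)); delta_u bounds how much one
    # record can change any score. Names here are illustrative.
    rng = rng or np.random.default_rng()
    items = list(scores)
    logits = np.array([epsilon * scores[i] / (2 * delta_u) for i in items])
    logits -= logits.max()  # for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return items[rng.choice(len(items), p=probs)]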

RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response

This paper describes and motivates RAPPOR, details its differential-privacy and utility guarantees, discusses its practical deployment and properties in the face of different attack models, and gives results of its application to both synthetic and real-world data.
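A greatly simplified sketch of the randomized-response primitive underlying RAPPOR (the real system layers Bloom filters and permanent/instantaneous randomization on top); p_truth is an assumed parameter:

import random

def randomized_response(true_bit: int, p_truth: float = 0.75) -> int:
    # Report the true bit with probability p_truth, else a fair coin flip.
    if random.random() < p_truth:
        return true_bit
    return random.randint(0, 1)

def estimate_frequency(reports, p_truth: float = 0.75) -> float:
    # Unbiased estimate of the true fraction of 1s from noisy reports:
    # E[observed] = p_truth * true + (1 - p_truth) * 0.5, solved for true.
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth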

Practical Differentially Private Top-k Selection with Pay-what-you-get Composition

This work designs algorithms that ensure (approximate) $(\epsilon, \delta)$-differential privacy with $\delta > 0$ and need access only to the true top-$\bar{k}$ elements from the data, for any chosen $\bar{k} \geq k$.
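One hedged reading of the construction: add Gumbel noise to the counts of the true top-$\bar{k}$ candidates and release the $k$ largest noisy ones. The noise scale below is an illustrative choice, not the paper's exact calibration.

import numpy as np

def gumbel_top_k(counts: dict, k: int, k_bar: int, epsilon: float, rng=None):
    # Sketch only: restrict to the true top-k_bar elements (the only data
    # access needed), noise them, and return the k largest by noisy count.
    rng = rng or np.random.default_rng()
    candidates = sorted(counts, key=counts.get, reverse=True)[:k_bar]
    noisy = {c: counts[c] + rng.gumbel(scale=k / epsilon) for c in candidates}
    return sorted(noisy, key=noisy.get, reverse=True)[:k]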

Towards Practical Differential Privacy for SQL Queries

It is proved that elastic sensitivity is an upper bound on local sensitivity and can therefore be used to enforce differential privacy with any local-sensitivity-based mechanism; building on this, FLEX, a practical end-to-end system that enforces differential privacy for SQL queries using elastic sensitivity, is presented.

Calibrating Noise to Sensitivity in Private Data Analysis

The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise to the sensitivity of the function f, i.e., the maximum amount by which any single argument to f can change its output.
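A minimal sketch of the resulting Laplace mechanism, with noise scaled to the sensitivity of f; parameter names are illustrative:

import numpy as np

def laplace_mechanism(value: float, sensitivity: float, epsilon: float,
                      rng=None) -> float:
    # Release value + Laplace(sensitivity / epsilon) noise.
    rng = rng or np.random.default_rng()
    return value + rng.laplace(scale=sensitivity / epsilon)

# Example: a counting query (sensitivity 1) released with epsilon = 0.5.
print(laplace_mechanism(value=42.0, sensitivity=1.0, epsilon=0.5))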

Our Data, Ourselves: Privacy Via Distributed Noise Generation

This work provides efficient distributed protocols for generating shares of random noise, secure against malicious participants, and introduces a technique for distributing shares of many unbiased coins with fewer executions of verifiable secret sharing than would be needed using previous approaches.
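As a toy sketch of the distributed-noise idea (not the paper's verifiable-secret-sharing protocol), n parties can each contribute a small Gaussian share so that the shares sum to noise of the target variance:

import numpy as np

def noise_shares(n_parties: int, target_sigma: float, rng=None):
    # Each share is N(0, target_sigma**2 / n), so the sum of all shares
    # has variance n * (target_sigma**2 / n) = target_sigma**2.
    rng = rng or np.random.default_rng()
    return rng.normal(scale=target_sigma / np.sqrt(n_parties), size=n_parties)

shares = noise_shares(n_parties=5, target_sigma=2.0)
total_noise = shares.sum()  # distributed as N(0, target_sigma**2)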

Collecting Telemetry Data Privately

This paper develops new LDP mechanisms geared towards repeated collection of counter data, with formal privacy guarantees even after being executed for an arbitrarily long period of time, which have been deployed by Microsoft to collect telemetry across millions of devices.
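A hedged sketch of a one-bit local-DP report for a bounded counter in this spirit; the memoization that makes repeated collection safe over time is omitted here, and the names are assumptions:

import math, random

def one_bit_report(x: float, x_max: float, epsilon: float) -> int:
    # Send a single biased bit whose expectation encodes x / x_max.
    e = math.exp(epsilon)
    p = 1.0 / (e + 1.0) + (x / x_max) * (e - 1.0) / (e + 1.0)
    return 1 if random.random() < p else 0

def estimate_mean(bits, x_max: float, epsilon: float) -> float:
    # Unbiased estimate of the population mean from one-bit reports,
    # inverting E[bit] = 1/(e+1) + (mean/x_max) * (e-1)/(e+1).
    e = math.exp(epsilon)
    avg = sum(bits) / len(bits)
    return x_max * (avg * (e + 1.0) - 1.0) / (e - 1.0)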

The modernization of statistical disclosure limitation at the U.S. Census Bureau (2017)