Feature Noise Induces Loss Discrepancy Across Groups

@inproceedings{Khani2020FeatureNI,
  title={Feature Noise Induces Loss Discrepancy Across Groups},
  author={Fereshte Khani and Percy Liang},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2020}
}
The performance of standard learning procedures has been observed to differ widely across groups. Recent studies usually attribute this loss discrepancy to an information deficiency for one group (e.g., one group has less data). In this work, we point to a more subtle source of loss discrepancy: feature noise. Our main result is that even when there is no information deficiency specific to one group (e.g., both groups have infinite data), adding the same amount of feature noise to all…
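To make the main result concrete, below is a minimal simulation sketch. It is not code from the paper: the two-group setup, the parameter values, and the use of plain linear regression are all illustrative assumptions. Two groups differ only in the mean of a single feature, everyone receives the same Gaussian feature noise, and the population-optimal linear predictor is fit on the pooled data.

import numpy as np

rng = np.random.default_rng(0)

def group_mse(sigma, n=1_000_000, p_minority=0.1, shift=2.0):
    """Fit pooled OLS with feature-noise level `sigma` and return
    (minority MSE, majority MSE). All parameter values are illustrative."""
    minority = rng.random(n) < p_minority
    x = shift * minority + rng.standard_normal(n)  # true feature
    y = x                                          # target: no label noise
    z = x + sigma * rng.standard_normal(n)         # same noise for everyone

    # Population-optimal linear predictor (OLS with intercept).
    beta = np.cov(z, y, bias=True)[0, 1] / z.var()
    alpha = y.mean() - beta * z.mean()
    err2 = (y - (alpha + beta * z)) ** 2
    return err2[minority].mean(), err2[~minority].mean()

for sigma in (0.0, 1.0):
    mse_min, mse_maj = group_mse(sigma)
    print(f"sigma={sigma}: minority MSE={mse_min:.3f}, majority MSE={mse_maj:.3f}")

With sigma=0 both groups incur zero loss; with sigma=1 the minority group's MSE is roughly twice the majority's. The noise shrinks the optimal predictor toward the pooled mean, and the group whose mean lies farther from that pooled mean pays more, even though neither group lacks data.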

