Racial Disparity in Natural Language Processing: A Case Study of Social Media African-American English

Abstract

We highlight an important frontier in algorithmic fairness: disparity in the quality of natural language processing algorithms when applied to language from authors of different social groups. For example, current systems sometimes analyze the language of females and minorities more poorly than that of whites and males. We conduct an empirical analysis of racial disparity in language identification for tweets written in African-American English, and discuss implications of disparity in NLP.

Cite this paper

@article{Blodgett2017RacialDI,
  title   = {Racial Disparity in Natural Language Processing: A Case Study of Social Media African-American English},
  author  = {Su Lin Blodgett and Brendan T. O'Connor},
  journal = {CoRR},
  volume  = {abs/1707.00061},
  year    = {2017}
}