Companies and governments need to pay attention to the unconscious and institutional biases that seep into their algorithms, argues cybersecurity expert Megan Garcia. Distorted data can skew results in web searches, home loan decisions, or photo recognition software. But the combination of increased attention to the inputs, greater clarity about the properties of the code itself, and the use of crowd-level monitoring could contribute to a more equitable online world. Without careful consideration, Garcia writes, our technology will be just as racist, sexist, and xenophobic as we are.
Research Article | December 1, 2016
Racist in the Machine: The Disturbing Implications of Algorithmic Bias
World Policy Journal (2016) 33 (4): 111–117. doi: https://doi.org/10.1215/07402775-3813015