Companies and governments must pay attention to the unconscious and institutional biases that seep into their algorithms, argues cybersecurity expert Megan Garcia. Distorted data can skew results in web searches, home loan decisions, or photo recognition software. But closer attention to the inputs, greater transparency about the code itself, and crowd-level monitoring could together contribute to a more equitable online world. Without careful consideration, Garcia writes, our technology will be just as racist, sexist, and xenophobic as we are.
