Bias in AI models — Weapons of Math Destruction

Daniel Johnson
Feb 21, 2021

For my artificial intelligence class, I was asked to write about bias in AI models, drawing on Cathy O'Neil's book, "Weapons of Math Destruction."

According to O'Neil, a model "is nothing more than an abstract representation of some process, be it a baseball game, an oil company's supply chain, a foreign government's actions, or a movie theater's attendance. Whether it's running in a computer program or in our head, the model takes what we know and uses it to predict responses in various situations."

To talk about biased models, it helps to start with an example of an unbiased one. O'Neil uses Moneyball, the practice of using statistics to predict outcomes in baseball. Baseball's data is public, constantly growing, and directly reflects what has happened in the game. Public data is good because it is open to public scrutiny, though it is bad if you want to build the newest state-of-the-art click predictor in Silicon Valley. Constantly updated data is helpful because the game and its players constantly change: if someone starts hitting more balls to left field, the data will reflect that. And there is no bias in the contents of the data, because it records events that actually happened. However, not all models are this fair.
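To make that concrete, here is a minimal Python sketch of the Moneyball idea. The numbers are hypothetical and the on-base formula is simplified (real OBP also counts hit-by-pitches and sacrifice flies), but it shows the key property: the "model" is just arithmetic over events that actually happened, and it updates as new games are logged.

```python
# Each entry is one plate appearance: 'H' = hit, 'BB' = walk, 'OUT' = out.
# Hypothetical data; in practice this comes straight from public box scores.
plate_appearances = ["H", "OUT", "BB", "OUT", "H", "OUT", "OUT", "BB"]

hits = plate_appearances.count("H")
walks = plate_appearances.count("BB")

# Simplified on-base percentage: times on base / times at the plate.
on_base_pct = (hits + walks) / len(plate_appearances)

print(f"on-base percentage: {on_base_pct:.3f}")  # recomputed after every game
```

There is nothing to argue with here: every input is a recorded event, and anyone can rerun the calculation and get the same answer.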

A model can be biased in many ways. The biggest is when a model does not have enough data, or when that data is not relevant enough for the task at hand. When an entity decides its data is not good enough, it looks for the most cost-efficient way to improve it; this is capitalism, after all. It creates proxies: stand-in data points that the creators believe represent the whole population. This is clearly subject to bias, because the model now encodes what its creators believe. One example O'Neil gives is a model that bases its decision on whether to give someone a loan on the applicant's zip code. The assumption that someone who lives in a poor area will not pay back a loan is a discriminatory bias. This is what O'Neil calls a WMD, a weapon of math destruction.
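To see how a proxy smuggles bias into a model, here is a minimal Python sketch. Everything in it is hypothetical: the zip codes, poverty rates, weights, and scoring rule are made up for illustration, not taken from any real lender. The point is that two applicants who are identical as individuals get different scores purely because of where they live.

```python
# Hypothetical lookup table: poverty rate by zip code. This is the proxy --
# it describes the neighborhood, not the applicant.
ZIP_POVERTY_RATE = {"66044": 0.30, "66213": 0.05}

def loan_score(income: float, zip_code: str) -> float:
    """Toy loan score, higher is better. Weights are made up."""
    income_term = min(income / 100_000, 1.0)       # individual signal
    proxy_term = 1.0 - ZIP_POVERTY_RATE[zip_code]  # neighborhood signal
    return 0.5 * income_term + 0.5 * proxy_term

# Two applicants with the same income, differing only by zip code:
print(loan_score(60_000, "66213"))  # 0.775 -> likely approved
print(loan_score(60_000, "66044"))  # 0.650 -> may be denied
```

The second applicant is penalized for their neighbors' circumstances, not their own behavior; that is exactly the discriminatory shortcut O'Neil warns about.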
