Weapons of Math Destruction, Cathy O’Neil. New York: Broadway Books, 2017.
Summary: An insider account of the algorithms that affect our lives, from college admissions to the ads we see online to our chances of getting a job, being arrested, or obtaining credit and insurance.
Big Data is indeed BIG. Mathematical algorithms shape who will see this post in their Facebook newsfeed. If you go to Amazon or another online bookseller, algorithms will suggest other books you might be interested in, like this one. Have you seen all those ads about credit scores? They are more important than you might imagine. Algorithms used by employers and insurance companies determine your employability and insurability in part through these scores. Far more than another credit card (a bad idea, by the way) or a mortgage is on the line. These algorithms seem objective, but how they are formulated, and the assumptions made in doing so, mean the difference between useful tools that benefit people and “black boxes” that thwart the flourishing of others, often without their knowledge.
Cathy O’Neil should know. A tenure-track math professor, she made the jump to Wall Street as a “quant,” helping develop mathematical algorithms, and witnessed in the crash of 2008 the harm some of them caused. She then began to notice how algorithms often painfully affected the lives of many others. She describes a teacher who was fired because of the weighting of performance scores from a single class, despite other evaluations finding her an excellent teacher (it was later discovered that tests from students who would have been in her class the previous year showed an unusually high number of erasures, suggesting scores had been altered upward).
As she looked at the algorithms responsible for such injustices, she came to dub them “Weapons of Math Destruction,” or WMDs, and identified three characteristics of these WMDs:
- Opacity: those whose lives are affected have no idea what factors, or what weighting of those factors, contributed to their “score.”
- Scale: how widely an algorithm is applied across industries and sectors of life determines how much of one’s life is touched by a single formula. For example, the credit (FICO) scores mentioned above affect not only credit but also the ability to get a job, the cost of auto insurance, and the ability to rent an apartment.
- Damage: WMDs can reinforce other factors, perpetuating cycles of poverty or incarceration.
She also shows that what makes these algorithms destructive is the use of proxy measurements. For example, an employer may not know directly how savvy someone is as a marketer, so it uses a “proxy” measurement: how many Twitter followers the person has. Or age is used as a proxy for how safe a driver one is. A proxy may work well for a group as a whole and still be utterly inaccurate for an individual within that group.
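O’Neil’s point about proxies can be sketched in a few lines of code. This is a minimal illustration with entirely invented candidates and an invented hiring rule (the follower threshold is an assumption for the sake of the example), not anything from the book: a proxy that correlates with a trait on average can still misjudge particular individuals on both sides of the cutoff.

```python
# Hypothetical illustration of proxy-based scoring (all data invented).
# The proxy: follower count stands in for actual marketing skill.

def proxy_score(followers, threshold=10_000):
    """Hypothetical hiring rule: enough followers => 'hire'."""
    return "hire" if followers >= threshold else "reject"

# (follower_count, actually_skilled) — invented candidates
candidates = [
    (50_000, True),   # proxy and reality agree
    (30_000, True),   # proxy and reality agree
    (12_000, False),  # bought followers: proxy wrongly says "hire"
    (800,    True),   # skilled but rarely online: proxy wrongly rejects
]

correct = sum(
    1 for followers, skilled in candidates
    if (proxy_score(followers) == "hire") == skilled
)
print(f"{correct}/{len(candidates)} decisions match actual skill")
```

On this toy data the rule is right only half the time, and notably the errors fall on individuals the group-level correlation says little about, which is exactly the failure mode O’Neil describes.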
Then in successive chapters, she chronicles some of the ways WMDs operate in different parts of life. She discusses the U.S. News & World Report college rankings and the use of algorithms in admissions processes. Social media platforms use algorithms to target advertising, which means some people will see ads for for-profit schools and payday lenders while others see ads for upscale furnishings or Viagra, based on clicks, likes, searches, and comments. Policing strategies, including the locations chosen for intensified “stop and frisk” policing, are shaped by another set of algorithms. Résumé-filtering software may score applicants by address, and personality-test algorithms may render people unemployable across an entire industry. Scheduling algorithms may promote efficiency at the expense of workers’ ability to sleep on a regular schedule, arrange childcare, or work enough hours to qualify for health insurance. Algorithms sometimes shut people out of credit or low-cost insurance when they are in fact good risks. She concludes by showing how algorithms determine the ads and news we see (and don’t see). In an afterword she explores the flaws in algorithms revealed by the election of Donald Trump (algorithms, for example, predicted Clinton would easily win Michigan and Wisconsin; consequently she did not campaign there and lost both by small margins).
In her conclusion, she makes the case not only for a code of ethics for mathematicians but also argues that regulation and auditing of these algorithms are necessary. The value assumptions, as well as the mathematical methods, of many algorithms are flawed, and yet opacity means those whose lives are most affected don’t even know what hit them.
She helps us see both the sinister and the useful sides of these algorithms. They may reveal where a proactive intervention could save a family from descending into violence, or flag a child in danger of falling behind in a key subject for extra assistance. Or they may be used to invade personal rights, or even to perpetuate socio-economic divides in a society. The reality is that the problem is not the math but the old GIGO problem (garbage in, garbage out). The values and assumptions of the humans who devise the formulas, the weightings, and the proxies determine outcomes that may be destructive for some people. Yet all of this can be hidden behind an app, a program, an algorithm.
The massive explosion in storage capacity and processing speed, and the way our interests, health status, travel patterns, spending patterns, fitness, diet and sleep habits, political inclinations, and more may be tracked via our online and smartphone usage, makes O’Neil’s warning an urgent one. We create mountains of data that may increasingly be mined by government and private interests. Perhaps as important as asking whether this will be governed in ways that contribute to our flourishing is asking whether we will be alert enough to care.
Disclosure of Material Connection: I received this book free from the publisher. I was not required to write a positive review. The opinions I have expressed are my own.