Ethics For Algorithms

Photo by Antonio Batinić on Pexels.com

You are probably aware that the material that shows up in your newsfeed on Facebook or Twitter is only a fraction of what your friends and connections are posting, and some of it is sponsored content tailored for you. Have you ever wondered why you are seeing what you are seeing? Algorithms (and a lot of data collected about you).

A similar kind of thing happens when you search on Google. I am surprised how often it works well and I find exactly what I’m looking for. But sometimes it goes sideways. Why, for example, when Dylann Roof followed up a Google search on Trayvon Martin with a search on “black on white crime,” did the top results point to white supremacist organizations? Algorithms.

Why, when I search for a book on Amazon, do I receive a number of recommendations in the form of “because you looked at this, you might like this”? Similar things happen at Barnes & Noble or at Thriftbooks. Why? Algorithms.

One definition of an algorithm I found is: “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.” Actually, algorithms are not some arcane mathematical art. They may be as simple as the process we use to solve a long division problem or the process we use in doing our laundry.
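To make that concrete, here is a minimal Python sketch (my own illustration, not from any source cited here) of long division written out as explicit rules a computer can follow:

```python
# A minimal sketch of an algorithm: long division expressed as explicit
# rules, done here by repeated subtraction, the way it is often taught
# by hand. Illustrative only.
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Return (quotient, remainder) by following simple rules."""
    if divisor == 0:
        raise ValueError("cannot divide by zero")
    quotient, remainder = 0, dividend
    while remainder >= divisor:   # rule: while enough is left,
        remainder -= divisor      # take away one more divisor,
        quotient += 1             # and count how many times we did.
    return quotient, remainder

print(long_division(47, 5))  # (9, 2), i.e., 47 = 5 * 9 + 2
```

A newsfeed ranking algorithm is the same idea at heart, just with rules about engagement instead of remainders.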

What is happening with all these algorithms is that somebody, an individual or group, has established a set of rules to determine what you see. What much of the world became aware of when Frances Haugen appeared on 60 Minutes this past Sunday night is that the ethic behind the rules that form these algorithms matters greatly. Haugen alleged, producing massive documentation, that Facebook has consistently chosen profit over safety on its various platforms. For one thing, it selected content that fostered anger on its newsfeeds, despite knowing that such content often spread misinformation and fostered division, simply because anger kept people on the platform longer, which is where the money is. Another prime example is the impact Facebook knew Instagram was having on teenage girls. Not only do glamorous images feed body-shame, but the company discovered that the shame and depression keep girls on the platform in an emotionally destructive spiral. They knew this and did nothing to change the algorithms determining what these girls saw.
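What might such a profit-driven ranking rule look like? Here is a deliberately simplified, hypothetical sketch; the posts, field names, and weight are all invented for illustration, and Facebook’s real system is proprietary and far more complex:

```python
# Hypothetical newsfeed ranker (invented for illustration; not
# Facebook's actual code). Weighting predicted "angry" reactions
# heavily, because anger keeps people scrolling, pushes divisive
# content to the top regardless of accuracy or harm.
posts = [
    {"text": "Cute puppy video", "pred_likes": 120, "pred_angry": 2},
    {"text": "Outrage-bait political claim", "pred_likes": 40, "pred_angry": 300},
]

def engagement_score(post, angry_weight=5.0):
    # Every reaction counts as engagement; "angry" is weighted most
    # because it best predicts time spent on the platform.
    return post["pred_likes"] + angry_weight * post["pred_angry"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(round(engagement_score(post)), post["text"])
# The outrage post scores 1540 vs. 130 and leads the feed. Changing
# angry_weight is an ethical decision wearing a numeric disguise.
```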

Computer-based algorithms are widely used for everything from fantasy baseball to mortgage application processing to screening resumes to your FICO score. People are not directly making decisions about what we see online or about our finances or career aspirations. Machines are making the decisions, using the rules programmers establish in the code.
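A hedged sketch of what that looks like in practice, using mortgage pre-screening as the example (the thresholds below are invented for illustration, not any lender’s actual criteria):

```python
# Hypothetical rule-based mortgage pre-screen. No human reads the
# application at this stage: rules a programmer encoded make the call.
def prescreen_mortgage(fico: int, debt_to_income: float) -> str:
    if fico < 620:                # cutoff chosen by a person, applied by a machine
        return "declined"
    if debt_to_income > 0.43:     # ditto: the ethics live in these numbers
        return "declined"
    return "passed to underwriting"

print(prescreen_mortgage(fico=640, debt_to_income=0.30))  # passed to underwriting
print(prescreen_mortgage(fico=619, debt_to_income=0.30))  # declined, by one point
```

The applicant who misses the cutoff by a point never learns which rule tripped them up, which is the opacity problem in miniature.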

There are at least a few key ethical considerations that rise to the top, highlighted in Cathy O’Neil’s Weapons of Math Destruction:

  1. How opaque or transparent are the rules used in the algorithm? Most of the time, the algorithms are highly opaque: we know that we’ve been affected but we don’t know why. Because of this, questions of fairness often arise: how do factors like race, gender, and age get weighted?
  2. What is the scale of impact of the algorithm? FICO scores affect credit, auto insurance costs, getting hired or promoted, and being able to rent an apartment or buy a house. How will this algorithm be used in the marketplace and what protects individuals from wrongful harm?
  3. What is the damage this could cause? Where possible, this should be considered proactively. For example, on social media, under what conditions is more engagement harmful to persons or the broader social context?

This is not easily done, particularly because algorithms serve beneficial purposes as well as cause harm. At the very least, identifying real instances of unfairness and harm and eliminating them, or better, anticipating them, seems a place to start. What was most egregious about Frances Haugen’s testimony was that internal studies showed known harms from platform algorithms that went unaddressed because of profit considerations. We should never use the complicated ethical questions to forestall dealing with the clear-cut ones. Let’s begin here.

Review: Weapons of Math Destruction

Weapons of Math Destruction, Cathy O’Neil. New York: Broadway Books, 2017.

Summary: An insider account of the algorithms that affect our lives, from applying to college, to the ads we see online, to our chances of getting a job, being arrested, or obtaining credit and insurance.

Big Data is indeed BIG. Mathematical algorithms shape who will see this post on their Facebook newsfeed. If you go to Amazon or another online bookseller, algorithms will suggest other books you might be interested in, like this one. Have you seen all those ads about credit scores? They are more important than you might imagine. Algorithms used by employers and insurance companies determine your employability and insurability in part through these scores. Far more than another credit card (bad idea, by the way) or a mortgage is on the line. These algorithms seem objective, but how they are formulated, and the assumptions made in doing so, mean the difference between useful tools that benefit people and “black boxes” that thwart the flourishing of others, often unknown to them.

Cathy O’Neil should know. A tenure-track math professor, she made the jump to Wall Street and became a “quant” who helped develop mathematical algorithms, and she witnessed, in the crash of 2008, the harm some of these caused. She began to notice how algorithms often painfully impacted the lives of many others. She describes how a teacher was fired because of the weighting of performance scores from a single class, despite other evaluations finding her an excellent teacher (afterwards, a high number of erasures was found on the tests of students who would have been in her class the previous year, suggesting those scores had been altered upward).

As she looked at the algorithms responsible for such injustices, she came to dub them “Weapons of Math Destruction” or WMDs and she identified three characteristics of these WMDs:

  1. Opacity: those whose lives are affected by them have no idea of the factors and weighting of those factors that contributed to their “score”.
  2. Scale: how widely an algorithm is applied across industries and sectors of life can affect how much of one’s life is touched by a single formula. For example, the FICO scores mentioned above affect not only credit, but the ability to get a job, the cost of auto insurance, and your ability to rent an apartment.
  3. Damage: WMDs can reinforce other factors, perpetuating cycles of poverty or incarceration.

She also shows that what makes these algorithms destructive is the use of proxy measurements. For example, an employer may not know directly how savvy someone is as a marketer, and so uses a “proxy” measurement: how many Twitter followers that person has. Or age is used as a proxy for how safe a driver one is. A proxy may work well for a group as a whole, yet be utterly inaccurate for an individual who falls within that group.
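Here is a toy numerical sketch of that point, with all numbers invented: an insurer scores accident risk by age, and the proxy lands close to each group’s average while badly mispricing individuals.

```python
from statistics import mean

# (age, actual accidents per decade) for a made-up pool of drivers
drivers = [(19, 4), (19, 3), (20, 0), (45, 0), (46, 1), (45, 4)]

def proxy_risk(age: int) -> float:
    # Proxy rule: drivers under 25 are scored as higher risk.
    return 2.5 if age < 25 else 1.5

young = [acc for age, acc in drivers if age < 25]
older = [acc for age, acc in drivers if age >= 25]
print(round(mean(young), 2), "actual vs. 2.5 proxy")  # 2.33: close for the group
print(round(mean(older), 2), "actual vs. 1.5 proxy")  # 1.67: close for the group
# Yet the careful 20-year-old (0 accidents) is overcharged and the
# risky 45-year-old (4 accidents) is undercharged: right on average,
# wrong person by person.
```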

Then in successive chapters, she chronicles some of the ways WMDs operate in different parts of life. She discusses the U.S. News & World Report college rankings and the use of algorithms in admissions processes. Social media uses algorithms to target advertising, which means some will see ads for for-profit schools and payday lenders, and others for upscale furnishings or Viagra, based on clicks, likes, searches, and comments. Policing strategies, including locations for intensified “stop and frisk” policing, are shaped by another set of algorithms. Resume filters may use scoring rules that discriminate by address, and psych-exam algorithms may render others unemployable in an entire industry. Scheduling algorithms may promote efficiency at the expense of workers’ ability to sleep on a regular schedule, arrange childcare, or work enough hours to qualify for health insurance. Algorithms sometimes shut people out from credit or low-cost insurance when in fact they are good risks. She concludes by showing how algorithms determine the ads and news we see (and don’t see). In an afterword she explores the flaws in algorithms revealed by the election of Donald Trump (algorithms, for example, predicted Clinton would easily win Michigan and Wisconsin, where consequently she did not campaign, and which she lost by small margins).

In her conclusion, she makes the case not only for a code of ethics for mathematicians but also argues that regulation and audits of these algorithms are necessary. The value assumptions, as well as the mathematical methods, of many algorithms are flawed, and yet opacity means those whose lives are most affected don’t even know what hit them.

She helps us see both the sinister and useful sides of these algorithms. They may reveal where a proactive intervention could save a family from descending into violence, or provide extra assistance to a child in danger of falling behind in a key subject. Or they may be used to invade personal rights, or even to perpetuate socio-economic divides in a society. The problem is not the math but the old GIGO problem (garbage in, garbage out). The values and assumptions of the humans who devise the formulas, the weightings, and the proxies determine outcomes that may be destructive for some people. Yet all of this can be hidden behind an app, a program, an algorithm.

The massive explosion in storage capacities and processing speeds, and the way our interests, health status, travel patterns, spending patterns, fitness, diet and sleep habits, political inclinations, and more may be tracked via our online and smartphone usage, makes O’Neil’s warning an urgent one. We create mountains of data that may be increasingly mined by government and private interests. Perhaps as important as asking whether this will be governed in ways that contribute to our flourishing is asking whether we will be alert enough to care.

____________________________

Disclosure of Material Connection: I received this book free from the publisher. I was not required to write a positive review. The opinions I have expressed are my own.