You are probably aware that the material that shows up in your newsfeed on Facebook or Twitter is only a fraction of what your friends and connections are posting, and that some of it is sponsored content tailored for you. Have you ever wondered why you are seeing what you are seeing? Algorithms (and a lot of data collected about you).
A similar kind of thing happens when you search on Google. I am surprised how often it works well and I find exactly what I’m looking for. But sometimes it goes sideways. Why, for example, when Dylann Roof followed up a Google search on Trayvon Martin with a search on “black on white crime,” did the top results come from white supremacist organizations? Algorithms.
Why, when I search for a book on Amazon, do I receive recommendations in the form of “because you looked at this, you might like this”? The same thing happens at Barnes & Noble and at Thriftbooks. Why? Algorithms.
One definition of an algorithm I found is: “a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.” Actually, algorithms are not some arcane mathematical art. They may be as simple as the process we use to solve a long division problem or the process we use in doing our laundry.
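The long-division example can be made concrete: the procedure we learned in grade school is itself an algorithm, a fixed sequence of steps precise enough for a computer to follow. Here is a minimal sketch in Python (the function name and structure are my own illustration, not from any particular textbook):

```python
def long_division(dividend, divisor):
    """Grade-school long division: work through the dividend one digit
    at a time, tracking a running remainder, just as you would on paper."""
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):
        remainder = remainder * 10 + int(digit)        # "bring down" the next digit
        quotient_digits.append(str(remainder // divisor))  # how many times the divisor fits
        remainder = remainder % divisor                # what is left over
    quotient = int("".join(quotient_digits))
    return quotient, remainder

# 7 goes into 1234 a total of 176 times, with 2 left over
print(long_division(1234, 7))  # (176, 2)
```

Nothing here is mysterious: every decision the function makes is a rule someone wrote down in advance. That is all an algorithm is, whether it divides numbers or ranks a newsfeed.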
What is happening with all these algorithms is that somebody, an individual or a group, has established a set of rules to determine what you see. What much of the world became aware of when Frances Haugen appeared on 60 Minutes this past Sunday night is that the ethic behind the rules that form these algorithms matters enormously. Haugen alleged, producing massive documentation, that Facebook has consistently chosen profit over safety on its various platforms. For one thing, its newsfeeds favored content that provoked anger, even though such content often spread misinformation and fostered division, simply because anger kept people on the platform longer, and that is where the money is. Another prime example is the impact Facebook knew Instagram was having on teenage girls. Not only do glamorous images feed body shame, but internal research found that the resulting shame and depression keep girls on the platform in an emotionally destructive spiral. The company knew this and did nothing to change the algorithms determining what these girls saw.
Computer-based algorithms are widely used for everything from fantasy baseball to mortgage application processing to screening resumes to your FICO score. People are not directly making decisions about what we see online or about our finances or career aspirations. Machines are making the decisions, using the rules programmers establish in the code.
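To make that concrete, here is a toy sketch of how programmer-written rules end up making a decision with no human in the loop. The rules and thresholds below are entirely invented for illustration; they are not any real lender's criteria. But real screening systems encode hard-coded rules of exactly this kind, chosen by their designers:

```python
def screen_mortgage_application(income, debt, credit_score):
    """Toy rule-based screen. The thresholds are hypothetical, but the
    point is real: whoever writes these rules decides the outcome."""
    if credit_score < 620:       # rule 1: a minimum credit score
        return "reject"
    if debt / income > 0.43:     # rule 2: a debt-to-income cap
        return "reject"
    return "approve"

print(screen_mortgage_application(income=60000, debt=20000, credit_score=700))  # approve
print(screen_mortgage_application(income=60000, debt=30000, credit_score=700))  # reject
```

Notice that the applicant never meets a person. The ethical weight sits entirely in the choice of rules and cutoffs, made long before any individual application arrives.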
There are at least a few key ethical considerations that rise to the top, highlighted in Cathy O’Neil’s Weapons of Math Destruction:
- How opaque or transparent are the rules used in the algorithm? Most of the time, algorithms are highly opaque: we know we’ve been affected, but we don’t know why. Because of this, questions of fairness often arise: how do factors such as race, gender, and age get factored in?
- What is the scale of impact of the algorithm? FICO scores affect credit, auto insurance costs, getting hired or promoted, and being able to rent an apartment or buy a house. How will this algorithm be used in the marketplace and what protects individuals from wrongful harm?
- What is the damage this could cause? Where possible, this should be considered proactively. For example, on social media, under what conditions is more engagement harmful to persons or the broader social context?
This is not easily done, particularly because algorithms serve beneficial purposes as well as cause harm. At the very least, identifying real instances of unfairness and harm and eliminating them, or better, anticipating them, seems the place to start. What was most egregious about Frances Haugen’s testimony is that internal studies showed known harms from platform algorithms, and these went unaddressed because of profit considerations. We should never use complicated ethical questions to forestall dealing with the clear-cut ones. Let’s begin here.