SPEAKER_05: Apple Card is the perfect credit card for every purchase. It has cash-back rewards unlike other cards. You earn unlimited daily cash back on every purchase, receive it daily, and can grow it at a 4.15% annual percentage yield when you open a high-yield savings account. Apply for Apple Card in the Wallet app on iPhone and start earning and growing your daily cash with savings today. Apple Card is subject to credit approval. Savings is available to Apple Card owners subject to eligibility requirements. Savings accounts are provided by Goldman Sachs Bank USA, Member FDIC. Terms apply.

Every kid learns differently, so it's really important that your children have the educational support that they need to help them keep up and excel. If your child needs homework help, check out IXL, the online learning platform for kids. IXL covers math, language arts, science, and social studies through interactive practice problems from pre-K to 12th grade. As kids practice, they get positive feedback and even awards. With the school year ramping up, now is the best time to get IXL. Our listeners can get an exclusive 20% off an IXL membership when they sign up today at ixl.com slash invisible. That's the letters I-X-L dot com slash invisible.

Squarespace is the all-in-one platform for building your brand and growing your business online. Stand out with a beautiful website, engage with your audience, and sell anything: your products, content you create, and even your time. You can easily display posts from your social profiles on your website, or share new blogs or videos to social media. Automatically push website content to your favorite channels so your followers can share it too. Go to squarespace.com slash invisible for a free trial, and when you're ready to launch, use the offer code INVISIBLE to save 10% off your first purchase of a website or domain.

This is 99% Invisible. I'm Roman Mars. On April 9th, 2017, United Airlines Flight 3411 was about to fly from Chicago to Louisville when flight attendants discovered the plane was overbooked. They tried to get volunteers to give up their seats with promises of travel vouchers and hotel accommodations. But not enough people were willing to get off.
SPEAKER_05: United ended up calling some airport security officers. They boarded the plane and forcibly removed a passenger named Dr. David Dao. The officers ripped Dao out of his seat and carried him down the aisle of the airplane, his nose bleeding, while horrified onlookers shot video with their phones.
SPEAKER_05: You probably remember this incident and the outrage it generated.
SPEAKER_01: The international uproar continued over the forced removal of a passenger from a United Airlines flight. Today, the airline's CEO, Oscar Munoz, issued an apology saying, quote, no one should ever be mistreated this way. I want you to know that we take full responsibility.
SPEAKER_05: But why Dr. Dao? How did he end up being the unlucky passenger that United decided to remove? Immediately following the incident, some people thought racial discrimination may have played a part, and it's possible that it influenced how he was treated. But the answer to how he was chosen was actually an algorithm, a computer program. It crunched through a bunch of data, looking at things like how much each passenger had paid for their ticket, what time they checked in, how often they flew on United, and whether they were part of a rewards program. The algorithm likely determined that Dr. Dao was one of the least valuable customers on the flight at the time.

Algorithms shape our world in profound and mostly invisible ways. They predict if we'll be valuable customers or whether we're likely to repay a loan. They filter what we see on social media, sort through resumes, and evaluate job performance. They inform prison sentences and monitor our health. Most of these algorithms have been created with good intentions. The goal is to replace subjective judgments with objective measurements. But it doesn't always work out like that. This subject is huge. I think algorithm design may be the big design problem of the 21st century. And that's why I wanted to interview Kathy O'Neill.

OK, well, thank you so much. So can we start? Can you give me one of those sort of NPR-style introductions and just say your name and what you do?
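United has never published its actual selection logic, so what follows is a purely hypothetical sketch of how a "customer value" ranking built from the factors just described might work. Every name, weight, and number is invented for illustration.

```python
# Hypothetical sketch only: United's real selection logic is not public.
# Ranks passengers by a made-up "customer value" score built from the
# factors mentioned above (fare paid, check-in time, loyalty status).
from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    fare_paid: float          # dollars paid for the ticket
    checkin_order: int        # 1 = checked in first
    flights_last_year: int    # how often they fly this airline
    rewards_member: bool

def customer_value(p: Passenger) -> float:
    score = p.fare_paid                       # higher fares weigh more
    score -= 2.0 * p.checkin_order            # late check-in lowers the score
    score += 5.0 * p.flights_last_year        # frequent flyers weigh more
    score += 50.0 if p.rewards_member else 0  # loyalty program bonus
    return score

def pick_for_removal(passengers: list[Passenger], n: int) -> list[Passenger]:
    # The n lowest-scoring passengers are "selected" for involuntary removal.
    return sorted(passengers, key=customer_value)[:n]

passengers = [
    Passenger("A", 450.0, 3, 12, True),
    Passenger("B", 89.0, 41, 1, False),
    Passenger("C", 210.0, 17, 4, False),
]
print([p.name for p in pick_for_removal(passengers, 1)])  # -> ['B']
```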
SPEAKER_02: Sure. I'm Kathy O'Neill. I'm a mathematician, data scientist, activist and author. I wrote the book Weapons of Math Destruction, How Big Data Increases Inequality and Threatens Democracy.
SPEAKER_05: O'Neill studied number theory and then left academia to build predictive algorithms for a hedge fund. But she got really disillusioned by the use of mathematical models in the financial industry.
SPEAKER_02: I wanted to have more impact in the world, but I didn't really know that that impact could be really terrible. I was very naive.
SPEAKER_05: After that, O'Neill worked as a data scientist at a couple of startups. And through these experiences, she started to get worried about the influence of poorly designed algorithms. So we'll start with the most obvious question. What is an algorithm? At its most basic, an algorithm is a step by step guide to solving a problem. It's a set of instructions, like a recipe.
SPEAKER_02: The example I like to give is like cooking dinner for my family.
SPEAKER_05: So in this case, the problem is how to make a successful dinner. O'Neill starts with a set of ingredients. And as she's creating the meal, she's constantly making choices about what ingredients are healthy enough to include in her dinner algorithm.
SPEAKER_02: I curate that data because those ramen noodle packages that my kids like so much, I don't think of those as ingredients, right? So I exclude them. So I'm curating and I'm therefore imposing my agenda on this algorithm.
SPEAKER_05: In addition to curating the ingredients, O'Neill as the cook also defines what a successful outcome looks like.
SPEAKER_02: I'm also defining success, right? I'm in charge of success. I define success to be if my kids eat vegetables at that meal. And you know, a different cook might define success differently.
SPEAKER_05: You know, my eight-year-old would define success to be like whether he got to eat Nutella.
SPEAKER_02: So that's another way where we, the builders, impose our agenda on the algorithm.
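To make the dinner analogy concrete, here is a toy sketch, entirely invented, of how the same curated ingredients yield different "best" meals depending on whose definition of success the algorithm encodes.

```python
# Toy illustration of O'Neill's point: the designer curates the inputs and
# defines success, so the "objective" output reflects their agenda.
meals = [
    {"name": "stir-fry", "vegetables": 3, "nutella": 0},
    {"name": "ramen packets", "vegetables": 0, "nutella": 0},
    {"name": "crepes", "vegetables": 0, "nutella": 2},
]

# Curation step: the cook excludes ramen packets as an "ingredient" at all.
curated = [m for m in meals if m["name"] != "ramen packets"]

# Two different definitions of success applied to the same curated data.
parent_success = lambda m: m["vegetables"]   # parent: most vegetables wins
kid_success = lambda m: m["nutella"]         # eight-year-old: most Nutella wins

print(max(curated, key=parent_success)["name"])  # -> stir-fry
print(max(curated, key=kid_success)["name"])     # -> crepes
```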
SPEAKER_05: O'Neill's main point here is that algorithms aren't really objective, even when they're carried out by computers. This is relevant because the companies that build them like to market them as objective, claiming they remove human error and fallibility from complex decision-making. But every algorithm reflects the priorities and judgments of its human designer. Of course, that doesn't necessarily make algorithms bad.
SPEAKER_02: Right, so I mean, it's very important to me that I don't get the reputation of hating all algorithms. I actually like algorithms, and I think algorithms could really help.
SPEAKER_05: But O'Neill does single out a particular kind of algorithm for scrutiny. These are the ones we should worry about.
SPEAKER_02: And they're characterized by three properties. Number one, they're very widespread and important, so they make important decisions about a lot of people. Number two, they're secret: the people being scored don't understand how they're being scored. And number three, they're destructive: one bad mistake in the design of these algorithms, if you will, will not only make things unfair for individuals, but categorically unfair for enormous populations as it gets scaled up.
SPEAKER_05: O'Neill has a shorthand for these algorithms, the widespread, mysterious, and destructive ones. She calls them weapons of math destruction. To show how one of these destructive algorithms works, O'Neill points to the criminal justice system. For hundreds of years, key decisions in the legal process, like the amount of bail, length of sentence, and likelihood of parole, have been in the hands of fallible human beings guided by their instincts and sometimes their personal biases.
SPEAKER_02: The judges are sort of famously racist, some of them more than others.
SPEAKER_05: And that racism can produce very different outcomes for defendants. For example, the ACLU has found that sentences imposed on black men in the federal system are nearly 20 percent longer than those for white men convicted of similar crimes. And studies have shown that prosecutors are more likely to seek the death penalty for African Americans than for whites convicted of the same charges. So you might think that computerized models fed by data would contribute to more even-handed treatment. The criminal justice system thinks so, too. It has increasingly tried to minimize human bias by turning to risk assessment algorithms.
SPEAKER_02: Like crime risk. Like what is the chance of someone coming back to prison after leaving it?
SPEAKER_05: Many of these risk algorithms look at a person's record of arrests and convictions. The problem is that data is already skewed by some social realities. Take for example the fact that white people and black people use marijuana at roughly equal rates. And yet, there's five times as many blacks getting arrested for smoking pot as whites.
SPEAKER_02: Five times as many.
SPEAKER_05: This may be because black neighborhoods tend to be more heavily policed than white neighborhoods, which means black people get arrested for certain crimes more often than white people. Risk algorithms detect these patterns and apply them to the future. So if the past is shaped in part by racism, the future will be, too.
SPEAKER_02: The larger point is we have terrible data here, but the statisticians involved, the data scientists, are like blithely going forward and pretending that our data is good. And then we're using it to actually make important decisions.
SPEAKER_05: Risk assessment algorithms also look at a defendant's answers to a questionnaire that's supposed to tease out certain risk factors.
SPEAKER_02: They have questions like, you know, did you grow up in a high crime neighborhood? Are you on welfare? Do you have a mental health problem? Do you have addiction problems? Did your father go to prison? You know, they're basically proxies for race and class, but it's embedded in this scoring system and the judge is given the score and it's called objective.
SPEAKER_05: What does the judge take away from it or, you know, how is it used?
SPEAKER_02: If you have a high risk score, it's used to send you to prison for longer in sentencing. It's also used in bail hearings and parole hearings. If you have a high recidivism risk score, you don't get parole.
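Real recidivism tools are proprietary and far more complicated, but a minimal invented sketch shows the basic mechanics being described: questionnaire answers that act as proxies for race and class get collapsed into a single number, and that number then drives bail, sentencing, and parole decisions. The weights and threshold below are made up.

```python
# Invented sketch; real recidivism risk tools are proprietary and more complex.
# Questionnaire answers that proxy for race and class are collapsed into one
# number, which then shapes bail, sentencing, and parole.
QUESTION_WEIGHTS = {
    "grew_up_high_crime_neighborhood": 2.0,
    "on_welfare": 1.5,
    "mental_health_problem": 1.0,
    "addiction_problem": 2.5,
    "father_went_to_prison": 3.0,
}

def risk_score(answers: dict[str, bool]) -> float:
    return sum(w for q, w in QUESTION_WEIGHTS.items() if answers.get(q))

def parole_decision(answers: dict[str, bool], threshold: float = 5.0) -> str:
    # The judge sees only the number, not how it was produced.
    return "deny parole" if risk_score(answers) >= threshold else "grant parole"

defendant = {
    "grew_up_high_crime_neighborhood": True,
    "on_welfare": True,
    "father_went_to_prison": True,
}
print(risk_score(defendant), parole_decision(defendant))  # -> 6.5 deny parole
```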
SPEAKER_05: And presumably you could take all that biased input data and say this high chance of recidivism means that we should rehabilitate more. I mean, you could take all that same stuff and choose to do a completely different thing with the result of the algorithm.
SPEAKER_02: That's exactly my point. Exactly my point. We could say, oh, I wonder why people who have this characteristic have so much worse recidivism. Well, let's try to help them find a job. Maybe that'll help. We could use those algorithms, those risk scores to try to account for our society.
SPEAKER_05: Instead, O'Neill says, in many cases, we're effectively penalizing people for societal and structural issues that they have little control over. And we're doing it at a massive scale using these new technological tools.
SPEAKER_02: We're shifting the blame, if you will, from the society, which is the one that should own these problems, to the individual and punishing them for it.
SPEAKER_05: It should be said that, in some cases, algorithms are helping to change elements of the criminal justice system for the better. For example, New Jersey recently did away with its cash bail system, which disadvantaged low-income defendants. The state now relies on predictive algorithms instead, and data shows that its pretrial county jail populations are down by about 20 percent. But still, algorithms like that one remain unaudited and unregulated. And it's a problem when algorithms are basically black boxes. In many cases, they're designed by private companies who sell them to other companies, and the exact details of how they work are kept secret. Not only is the public in the dark, even the companies using these things might not understand exactly how the data is being processed. This is true of many of the problematic algorithms that O'Neill has looked at, whether they're used for sorting loan applications or assessing teacher performance.
SPEAKER_02: There's some kind of weird thing that happens to people when mathematical scores are trotted out. They just start closing their eyes and believing it because it's math. And they do, I feel like, oh, I'm not an expert of math, so I can't push back. And that's something you just see time and time again. You're like, why didn't you question this? This doesn't make sense. Oh, well, it's math and I don't understand it.
SPEAKER_05: Right now, it seems like algorithms and math are just a new place to put blame so that you do not have to think about your decisions as an actual company, because these things are just so powerful and so mesmerizing to us, especially right now. They can be used in all kinds of nefarious ways. They're almost magical.
SPEAKER_05: Yeah. That's scary. It's scary.
SPEAKER_02: I think I would go one step further than that. I feel like just by observation that these algorithms, they don't show up randomly. They show up when there's a really difficult conversation that people want to avoid. They're like, oh, we don't know what makes a good teacher and different people have different opinions about that, so let's just bypass this conversation by having an algorithm score teachers. Or we don't know what prison is really for.
SPEAKER_02: Let's have a way of deciding how long to sentence somebody. We introduced these silver bullet mathematical algorithms because we don't want to have a conversation.
SPEAKER_05: In O'Neill's book, she writes about this young guy named Kyle Behm, who takes some time off college to get treated for bipolar disorder. Once he's better and ready to go back to school, he applies for a part-time job at Kroger, which is a big grocery store chain. He has a friend who works there who offers to vouch for him. Kyle was such a good student that he figured the application would be just a formality, but he didn't get called back for an interview. His application was red-lighted by the personality test he'd taken when he applied for the job. The test was part of an employee selection algorithm developed by a private workforce management company called Kronos.
SPEAKER_02: 70% of job applicants in this country take personality tests before they get an interview, so this is a very common practice. Kyle had that screening and he found out because his friend worked at Kroger's that he had failed the test, so most people never find that out. They just don't hear back. The other thing that was unusual about Kyle is that his dad is a lawyer, so his dad was like, what were the questions like on this test? He said, well, some of them were a lot like the questions I got at the hospital, the mental health assessment.
SPEAKER_05: The test Kyle got at the hospital was called the five-factor model test, and it grades people on extraversion, agreeableness, conscientiousness, neuroticism, and openness to new ideas. It's used in mental health evaluations. The potential employee's answers to the test are then plugged into an algorithm that decides whether the person should be hired.
SPEAKER_02: So his father was like, whoa, that's illegal under the Americans with Disabilities Act. So his father and he sort of figured out together that something very fishy had been going on, and his father has actually filed a class action lawsuit against Kroger's for that.
SPEAKER_05: The suit is still pending, but arguments are likely to focus on whether the personality test can be considered a medical exam. If it is, it'd be illegal under the ADA. O'Neill gets that different jobs require people with different personality types, but she says a hiring algorithm is a blunt and unregulated tool that ends up disqualifying big categories of people, which makes it a classic weapon of math destruction.
SPEAKER_02: In certain jobs, you wouldn't want neurotic people or introverted people. Like if you're at a call center where a lot of really irate customers call you up, that might be a problem. In which case, it is actually legal if you get an exception for your company. The problem is that these personality tests are not carefully designed for each business, but rather what happens is that these companies just sell the same personality test to all the businesses that will buy them.
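Kronos's actual scoring model is secret, so the following is only an illustrative guess at the general shape of such a screen: five-factor answers reduced to one composite score and compared against a fixed cutoff that was never tuned for the particular employer. All numbers are invented.

```python
# Illustrative only: the actual Kronos scoring model is not public.
# Five-factor answers get reduced to a single pass/fail screening decision.
TRAITS = ["extraversion", "agreeableness", "conscientiousness",
          "neuroticism", "openness"]

def screen_applicant(scores: dict[str, float]) -> bool:
    # Hypothetical rule: penalize neuroticism, reward everything else,
    # then apply a fixed cutoff that was never tuned for this employer.
    composite = sum(scores[t] for t in TRAITS if t != "neuroticism")
    composite -= 2.0 * scores["neuroticism"]
    return composite >= 10.0   # below the cutoff: no interview, no explanation

kyle = {"extraversion": 3.0, "agreeableness": 4.5, "conscientiousness": 4.0,
        "neuroticism": 3.5, "openness": 4.0}
print(screen_applicant(kyle))  # -> False: application quietly red-lighted
```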
SPEAKER_05: A lot of the algorithms that O'Neill explores in her book are largely hidden. They don't get a lot of attention. We as consumers and job applicants and employees may not even be aware that they're humming along in the background of our lives, sorting us into piles and categories. But there is one kind of algorithm that's gotten a lot of attention in the news lately.
SPEAKER_01: Is this a good or bad thing that social media has been able to infiltrate politics?
SPEAKER_04: Social media is a technology. And as we know, technologies have their good sides and their dark sides, their not-so-good sides. So it all depends on...
SPEAKER_05: Towards the end of our conversation, O'Neill and I started talking about the recent election and the complex ways that social media algorithms shape the news that we receive. Facebook shows us stories and ads based on what they think we want. And of course, what they think we want is based on algorithms. These algorithms look at what we clicked on before and then feed us more content we like. The result is that we've ended up in these information silos, increasingly polarized and oblivious to what people of different political persuasions might be seeing.
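Facebook's real ranking system is proprietary, but the feedback loop described above can be sketched in a few lines: recommend more of whatever a user already clicked, and each feed drifts toward a silo. The toy data below is invented.

```python
# Tiny illustrative sketch (the real ranking system is proprietary):
# recommending more of what a user already clicked drifts each feed
# toward a silo of one political leaning.
from collections import Counter

stories = [("left", "story L1"), ("left", "story L2"),
           ("right", "story R1"), ("right", "story R2")]

def recommend(click_history: list[str], k: int = 2) -> list[str]:
    counts = Counter(click_history)
    # Rank stories by how often the user already clicked that leaning.
    ranked = sorted(stories, key=lambda s: counts[s[0]], reverse=True)
    return [title for _, title in ranked[:k]]

feed = recommend(["left", "left", "right"])
print(feed)  # -> ['story L1', 'story L2']: the other side quietly disappears
```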
SPEAKER_02: I do think this is a major problem. You know, the sky's the limit. We have built the internet and the internet is a propaganda machine. It's a propaganda delivery device, if you will. And that's not... I don't see how that's gonna stop.
SPEAKER_05: Yeah, especially if every moment is being optimized by an algorithm that's meant to manipulate your emotions.
SPEAKER_02: Right. That's exactly going back to Facebook's optimizer algorithm. That's not optimizing for truth, right? It's optimizing for profit. And they claim to be neutral, but of course nothing's neutral. And we have seen the results. We've seen what it's actually optimized for and it's not pretty.
SPEAKER_05: This kind of data-driven political micro-targeting means conspiracies and misinformation can gain surprising traction online. Stories claiming that Pope Francis endorsed Donald Trump and that Hillary Clinton sold weapons to ISIS gained millions of views on Facebook. Neither of those stories was true. Fixing the problem of these destructive algorithms is not going to be easy, especially when they're insinuating themselves into more and more parts of our lives. But O'Neill thinks that measurement and transparency are one place to start. Like with that Facebook algorithm and the political ads that it serves to its users. If you were to talk to Facebook about how to inject some ethics into their optimization, what would you do? Would you sort of make a case for the bottom line of truth being like a longer-tail way to make more money? Or would you just say, this is about ethics and you should be thinking about ethics?
SPEAKER_02: To be honest, if I really had their attention, I would ask them to voluntarily find a space on the web to just put every political ad, and actually every ad, just have a way for journalists and people interested in the concept of the informed citizenry to go through all the ads that they have on Facebook at a given time.
SPEAKER_05: Because even if that article about Hillary Clinton and ISIS was shared thousands of times, lots of people never saw it at all.
SPEAKER_02: Just show us what you're showing other people. Because I think one of the most pernicious issues is the fact that we don't know what other people are seeing. I'm not waiting for Facebook to actually go against their interests and change their profit goal. But I do think this kind of transparency can be demanded and given.
SPEAKER_05: O'Neill also says it's important to measure the broad effects of these algorithms and to understand who they most impact.
SPEAKER_02: Everyone should start measuring it. And what I mean by that is relatively simple, and this might not be a complete start, but it's a pretty good first step, which is measure for whom this fails.
SPEAKER_05: Meaning which populations are most negatively impacted by the results of these algorithms.
SPEAKER_02: And what is the harm that befalls those people for whom it fails? And how are the failures distributed across the population? So if you see a hiring algorithm fail much more often for women than for men, that's a problem. Especially if the failure is they don't get hired when they should get hired. I really do think that a study, a close examination of the distribution of failures and the harms of those failures would really, really be a good start.
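A minimal sketch of the audit O'Neill is proposing might look like this: for each group, measure how often the algorithm fails people it should not fail, then compare the rates. The hiring data below is invented.

```python
# Minimal sketch of the audit O'Neill describes: measure for whom the
# algorithm fails, and how those failures are distributed across groups.
# Hypothetical records: (group, qualified, hired_by_algorithm).
records = [
    ("women", True, False), ("women", True, True), ("women", True, False),
    ("men", True, True), ("men", True, True), ("men", True, False),
]

def failure_rate(group: str) -> float:
    qualified = [r for r in records if r[0] == group and r[1]]
    failures = [r for r in qualified if not r[2]]  # qualified but not hired
    return len(failures) / len(qualified)

for group in ("women", "men"):
    print(group, round(failure_rate(group), 2))
# women 0.67 vs men 0.33: the harm is concentrated on one population.
```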
SPEAKER_05: If you're not mad enough about how algorithms influence your life, I've got a doozy for you.

If you want to give your body the nutrients it craves and the energy it needs, there's Kachava. It's a plant-based super blend made up of superfoods, greens, proteins, omegas, vitamins, minerals, antioxidants, and probiotics. In other words, it's all your daily nutrients in a glass. Some folks choose to take it as the foundation of a healthy breakfast or lunch, while others lean on it as a delicious protein-packed snack to curb cravings and reduce grazing. If you're in a hurry, you can just add two scoops of Kachava super blend to ice water or your favorite milk or milk alternative and just get going. But I personally like to blend it with greens and fruit and ice. Treat yourself nice. Take a minute and treat yourself right. You'll get all the stuff that you need and feel great. Kachava is offering 10% off for a limited time. Just go to kachava.com slash invisible, spelled K-A-C-H-A-V-A, and get 10% off your first order. That's K-A-C-H-A-V-A dot com slash invisible. Kachava dot com slash invisible.

The International Rescue Committee works in more than 40 countries to serve people whose lives have been upended by conflict and disaster. Over 110 million people are displaced around the world, and the IRC urgently needs your help to meet this unprecedented need. The IRC aims to respond within 72 hours after an emergency strikes, and they stay as long as they are needed. Some of the IRC's most important work is addressing the inequalities facing women and girls, ensuring safety from harm, improving health outcomes, increasing access to education, improving economic well-being, and ensuring women and girls have the power to influence decisions that affect their lives. Generous people around the world give to the IRC to help families affected by humanitarian crises with emergency supplies. Your generous donation will give the IRC steady, reliable support, allowing them to continue their ongoing humanitarian efforts even as they respond to emergencies. Donate today by visiting rescue.org slash rebuild. Visit now and help refugee families in need.
SPEAKER_05: This show is sponsored by BetterHelp. Do you ever find that just as you're trying to fall asleep, your brain suddenly won't stop talking? Your thoughts are just racing around? I call this just going to bed. It basically happens every night. It turns out one great way to make those racing thoughts go away is to talk them through. Therapy gives you a place to do that so you can get out of your negative thought cycles and find some mental and emotional peace. If you're thinking of starting therapy, give BetterHelp a try. It's entirely online, designed to be convenient, flexible, and suited to your schedule. Just fill out a brief questionnaire to get matched with a licensed therapist and switch therapists at any time for no additional charge. Get a break from your thoughts with BetterHelp. Visit betterhelp.com slash invisible today to get 10% off your first month. That's BetterHelp. Visit betterhelp.com slash invisible. After these messages.
SPEAKER_03: We are currently experiencing higher call volumes than normal. Please stay on the line and an agent will be with you shortly.
SPEAKER_02: Here's one that I think is kind of fun because it's annoying and secret, but you would never know it. So if you call up a customer service line, I'm not saying this will always happen, but it will sometimes happen that your phone number will be used to backtrack who you are. The question being asked is: are you a high-value customer or a low-value customer? And if you're a high-value customer, you'll talk to a customer service representative much sooner than if you're a low-value customer. You'll be put on hold longer. That's how businesses make decisions nowadays.
SPEAKER_03: You are caller number 99. Your call is important to us. Please stay on the line.
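The routing systems behind those hold queues are proprietary, so this is only a hypothetical sketch of the mechanism O'Neill describes: the caller's number is used to look up a value score, and higher-value callers jump the queue. The numbers and lookup table are invented.

```python
# Hypothetical sketch: real call-center routing systems are proprietary.
# The caller's phone number is used to look up a value score, and
# high-value callers jump the hold queue.
import heapq

CUSTOMER_VALUE = {            # made-up lookup keyed by phone number
    "555-0101": 980.0,        # high spender
    "555-0199": 12.0,         # low spender
}

hold_queue: list[tuple[float, str]] = []

def enqueue(phone: str) -> None:
    value = CUSTOMER_VALUE.get(phone, 0.0)
    # heapq pops the smallest item first, so negate value for priority.
    heapq.heappush(hold_queue, (-value, phone))

def next_caller() -> str:
    return heapq.heappop(hold_queue)[1]

enqueue("555-0199")
enqueue("555-0101")
print(next_caller())  # -> 555-0101: the high-value caller waits less
```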
SPEAKER_05: 99% Invisible was produced this week by Delaney Hall. Tech and mix production by Emmett FitzGerald. Katie Mingle is the senior producer. Kurt Kohlstedt is the digital director. Sean Real composed all the music. The rest of the staff includes Avery Trufelman, Sharif Youssef, Taryn Mazza, and me, Roman Mars. Special thanks to Ryan Kiesler and Courtney Riddle. We are a project of 91.7 KALW in San Francisco and produced on Radio Row in beautiful downtown Oakland, California. 99% Invisible is part of Radiotopia from PRX, a collective of the best, most innovative shows in all of podcasting. We are supported by the Knight Foundation and coin-carrying listeners just like you. You can find 99% Invisible and join discussions about the show on Facebook. You can tweet at me @romanmars and the show @99piorg. We are on Instagram, Tumblr, and Reddit too. But our lovely home on the internet, with more design stories than we can ever tell you here on the radio or podcast, I guess this is a podcast, is our website at 99PI.org.
SPEAKER_05: A good night's sleep can be hard to come by these days, and finding the right mattress feels totally overwhelming. Serta's new and improved Perfect Sleeper is a simple solution designed to support all sleep positions. With zoned comfort, memory foam, and a cool-to-the-touch cover, the Serta Perfect Sleeper means more restful nights and more rested days. Find your comfort at Serta.com.
SPEAKER_00: Gatorade Zero has all the electrolytes and all the flavor of Gatorade with zero sugar to help you get more out of your workout routine. How much more? It helps you feel more hydrated through every mile, every set and every song in your fitness routine. No matter how you choose to move, Gatorade Zero got your back from yoga to kickboxing and everything in between. Gatorade Zero is the perfect partner for whatever workout comes your way, helping you get more, do more and be more with zero. Get more out of zero.