SPEAKER_10: In early 2007, Carol Hemelgarn's life was forever changed by a failure, a tragic medical failure. At the time, she was working for Pfizer, the huge U.S. pharmaceutical firm. So she was familiar with the health care system. But what changed her life wasn't a professional failure. This was personal.
SPEAKER_02: My nine-year-old daughter, Alyssa, was diagnosed with leukemia, A.L.L., on a Monday afternoon, and she died 10 days later. In this day and age of health care, children don't die of leukemia in nine days. She died from multiple medical errors. She got a hospital-acquired infection, which we know today can be prevented. She was labeled. And when you attach labels to patients, a bias is formed. And it's often difficult to look beyond that bias. So one of the failures in my daughter's care is that she was labeled with anxiety. The young resident treating her never asked myself or her father if she was an anxious child, and she wasn't. What happens is we treat anxiety, but we don't treat scared, afraid, and frightened. And that's what my daughter was. Hospitals are frightening places to children. So my daughter, with her hospital-acquired infection, became septic, but they were not treating her for the sepsis, because all they could focus on is they thought she was anxious, and they kept giving her drugs for anxiety. Even though the signs, the symptoms, and me as her mother kept telling them something was wrong, something wasn't right, they wouldn't listen to me. So by the time she was failing so poorly and was rushed to surgery and brought back out, there was nothing they could do for her. The first harm that they did to our daughter was unintentional. It was all the intentional harms after that: we were lied to, the medical records were hidden from us, people were told not to talk to us. And the fact that it took the organization three years, seven months, and 28 days to have the first honest conversation with us, those were all intentional harms. And that's why in health care we have to have transparency, because how many other children suffered because of the learning that didn't take place?
SPEAKER_10: Hemelgarn says she filed a claim against the hospital, but she didn't move forward with a lawsuit because of the emotional toll. She ultimately took a different path. In 2021, she co-founded an advocacy group called Patients for Patient Safety US. It is aligned with the World Health Organization. She also runs a master's program in clinical quality, safety, and leadership at Georgetown University.
SPEAKER_02: When harm does reach the patient or family, that is the time to really analyze what happened. And while you never want to harm a patient or family, one of the things you'll hear from patients and families after they have been harmed is they want to make sure that what happened to them or their loved one never happens again. The example I can give for myself personally is I did go back to the very organization where my daughter died, and I have done work there.
SPEAKER_10: Today on Freakonomics Radio, we continue with the series we began last week on failure.
SPEAKER_10: We acknowledged that some failure is inevitable. We are by definition fallible human beings, each and every one of us. And that failure can be painful.
SPEAKER_09: I don't think we should enjoy failure. I think failure needs to burn on us.
SPEAKER_10: This week, we focus on the health care system, where failure is literally a matter of life or death.
SPEAKER_10: Some organizations felt like they had already achieved the patient safety mission. Others, it wasn't even part of their strategic plan.
SPEAKER_02: And we will learn where on a spectrum to place every failure, from inexcusable...
SPEAKER_11: There are lots of examples of huge public sector failures, but this is one of the biggest.
SPEAKER_02: ...to life-saving.
SPEAKER_12: I really believe that if we could do this, it would make a big difference in medicine.
SPEAKER_10: How to succeed at failing part two, beginning now.
SPEAKER_05: This is Freakonomics Radio, the podcast that explores the hidden side of everything with your host, Stephen Dubner.
SPEAKER_10: The story of Carol Hemelgarn's daughter is tragic: a hospital death caused by something other than the reason the patient was in the hospital. Unfortunately, that type of death is not as rare as you might think. Consider the case of RaDonda Vaught, a nurse at Vanderbilt University Medical Center. In 2019, she was prosecuted for having administered the wrong medication to a patient who subsequently died. The patient was a 75-year-old woman who had been admitted to the hospital for a subdural hematoma, or bleeding in the brain. Here is RaDonda Vaught testifying at her trial.
SPEAKER_03: I was pulling this medication. I didn't think to double check what I thought I had pulled from the machine. I used the override function. I don't recall ever seeing any warnings that showed up on the monitor.
SPEAKER_10: The medication that Vaught meant to pull from the AccuDose machine was a sedative called Versed. What she mistakenly pulled was a paralytic called Vecuronium. Vecuronium instead of Versed.
SPEAKER_04: I won't ever be the same person. It's really... When I started being a nurse, I told myself that I wanted to take care of people the way that I would want my grandmother to be taken care of.
SPEAKER_10: RaDonda Vaught was convicted of negligent homicide and gross neglect of an impaired adult. Her sentence was three years' probation. You might expect a patient safety advocate like Carol Hemelgarn to celebrate Vaught's prosecution, but she doesn't.
SPEAKER_02: This doesn't solve problems. All this does is create silence and barriers. When errors happen, so often the frontline workers, your nurses, allied health professionals, physicians, were blamed. But what we've come to realize is it's really a systemic problem. They happen to be at the front line, but it's underlying issues that are at the root of these problems. It can be policies that aren't the right policies. It could be shortages of staff. It can be equipment failures that are known at device companies but haven't been shared with those using the devices. It can be medication errors because of labels that look similar or drug names that are similar.
SPEAKER_10: To get at the systemic problem in the Vanderbilt case, Hemelgarn's advocacy group filed a complaint with the Office of Inspector General in the Department of Health and Human Services.
SPEAKER_02: What we found most frustrating was the lack of leadership from Vanderbilt. Leadership never came out and took any responsibility. They never said anything. They never talked to the community. It was essentially silence from leadership. I think one of the other big failures we have in health care is fear. Health care is rooted in fear because of the fear of litigation. When there's a fear of litigation, silence happens. And until we flip that model, we're going to continue down this road.
SPEAKER_09: I absolutely share that worry. And that case was, in my mind, a classic case of a complex failure. Yes, there was a human error. We also had faulty medication labeling and storing practices, with alphabetical organization of drugs, which is not how you do it.
SPEAKER_10: That's Amy Edmondson. We heard from her last week, too. She is an organizational psychologist at the Harvard Business School. She recently published a book called Right Kind of Wrong: The Science of Failing Well. The Vanderbilt case was not an example of failing well. RaDonda Vaught, you will remember, dispensed vecuronium instead of Versed.
SPEAKER_09: You know, you don't have a dangerous, potentially fatal drug next to one that's routinely used in a particular procedure. It's what we might call an accident waiting to happen. With that perspective in mind, RaDonda is as much a victim of a system failure as a perpetrator of the failure. Right. So this reaction, to make this a criminal matter: human error is almost never criminal. To criminalize it, I think, reflects an erroneous belief that by doing so, we'll preclude human error. No, what we will do is preclude speaking up about human error. And to her credit, she spoke up, and one could argue that ultimately led to her conviction; she would have been better off somehow trying to hide it, which I wouldn't advocate, obviously. But when we recognize, deeply recognize, that errors will happen, then that means that what excellence looks like is catching and correcting errors, and then being forever on the lookout for vulnerabilities in our systems.
SPEAKER_10: Let's take a step back and look at the scale of this problem. In 1999, the Institute of Medicine, known today as the National Academy of Medicine, issued a report called To Err Is Human: Building a Safer Health System. It found that two to three percent of all U.S. hospital admissions result in preventable injury or death, with medical error causing between 44,000 and 98,000 deaths per year. A more recent study, published in 2013 in the Journal of Patient Safety, put the number of preventable deaths at U.S. hospitals at more like 200,000 a year. These large numbers of preventable deaths got a lot of attention in the medical community, but Carol Hemelgarn says the attention hasn't produced enough change.
SPEAKER_02: Some organizations felt like they had already achieved the patient safety mission. Others, it wasn't even part of their strategic plan. There are areas where improvement has definitely escalated since the report came out over 20 years ago. But it hasn't been fast enough. What we see is that not everything is implemented in the system, that you can oftentimes have champions that are doing this work. And if they leave, the work isn't embedded and sustainable.
SPEAKER_10: Amy Edmondson at Harvard has been doing research on medical failure for a long time, but she didn't set out to be a failure researcher.
SPEAKER_09: As an undergraduate, I studied engineering, sciences and design.
SPEAKER_10: Tell me about the first phase of your professional life, including with Buckminster Fuller.
SPEAKER_09: Yeah, so I'm answering that question with a huge smile on my face. I worked three years for Buckminster Fuller, who was an octogenarian, a creative person, an inventor, a genius, a writer, a teacher, best known for the geodesic dome, which he invented, but thinking single-mindedly about how do we use design to make a better world. You can't sort of get people to change; you have to change the environment and then they'll change with it, was a kind of notion that he had. My part was just doing engineering drawings and building models and doing the mathematics behind new, simpler geodesic configurations. And it was so much fun.
SPEAKER_10: And what was his view on failure generally?
SPEAKER_09: Oh, he was a very enthusiastic proponent of using failure to learn. He said often the only mistake we make is thinking we shouldn't make mistakes. He would give the example of the very first time he got a group of students together to build a geodesic dome. He had done the math, he'd come up with this idea, and he got 20 students together. They're outside. They built the thing and it immediately collapsed. And he enthusiastically said, okay, that didn't work. Now, what went wrong? And it was really the materials they were using, which I think the best way to describe them is Venetian blind materials. They had the tensile strength, but they certainly didn't have the compressive strength to do their job.
SPEAKER_10: And what are the steps you take to turn that failure into a useful learning, I guess is the noun we use these days?
SPEAKER_09: Immediate diagnosis, right? We just step back. Okay. What do we set out to do? What actually happened? Why might that be the case? What do we do differently next time? I mean, that's a sort of a rough outline of an after action review. It could be flawed assumptions. It could be flawed calculations. It could be any number of things. And we don't know until we put our heads together and try to figure it out.
SPEAKER_10: It was several years into her engineering career that Edmondson decided to get a PhD in organizational behavior.
SPEAKER_09: I was interested in learning in organizations and I got invited to be a member of a large team studying medication errors in hospitals. And the reason I said yes was first of all, I was a first year graduate student. I needed to do something. And second of all, I saw a very obvious link between mistakes and learning. And so I thought, here we've got these really smart people who will be identifying mistakes and then I can look at how do people learn from them and how easy is it and how hard is it. So that's how I got in there. And then one thing led to another. After doing that study, people kept inviting me back.
SPEAKER_10: I see. "She loves failure," they say.
SPEAKER_09: That's right.
SPEAKER_10: Edmondson focused her research on what are called preventable adverse drug events, like the one from the RaDonda Vaught case.
SPEAKER_09: Now you can divide adverse drug events into two categories, one which is related to some kind of human error or system breakdown and the other which is a previously unknown allergy. So literally couldn't have been predicted. And those are still adverse drug events, but they're not called preventable adverse drug events.
SPEAKER_10: But within the first category, there's probably 10 subcategories at least. Right. There's bad data entry, bad handwriting, right? Wrong eyeglasses. On and on it goes.
SPEAKER_09: Yeah. Or, you know, using language badly, so that people didn't understand what you said and they didn't feel safe asking.
SPEAKER_10: My wife had a knee surgery, easy knee surgery, and the painkiller that they prescribed on the spot, the doc actually stood there and wrote it, was for 100X the dosage.
SPEAKER_09: Oh, no, no.
SPEAKER_10: Yeah. Yeah. See, that's an error-driven preventable adverse drug event.
SPEAKER_09: Yes, I agree.
SPEAKER_09: You know, there will always be things that go wrong or at least not the way we wanted them to. And my observation in studying teams in a variety of industries and settings was that responses to failure were rather uniform, inappropriately uniform. The natural response and even the formal response was to find the culprit as if there was a culprit and either discipline or retrain or, you know, shame and blame the culprit. And it wasn't a very effective solution because the only way to prevent those kinds of system breakdowns is to be highly vigilant to how little things can line up and produce failures.
SPEAKER_10: Based on what she was learning from medical mistakes, Edmondson wanted to come up with a more general theory of failure, or if not a theory, at least a way to think about it more systematically: to remove some of the blame, to make the responses to failure less uniform. Over time, she produced what she calls, well, let's have Edmondson say it.
SPEAKER_09: My spectrum of causes of failures.
SPEAKER_10: After the break, we will hear about that spectrum of causes of failures. It can clarify some things, but not everything. Uncertainty is everywhere. I'm Stephen Dubner and you are listening to Freakonomics Radio. By the way, if you consider yourself a super fan of this show, we have just launched a new membership program, Freakonomics Radio Plus. Every week, members get a bonus episode of Freakonomics Radio. You also get to listen ad-free to this show and the other shows in the Freakonomics Radio Network. To sign up, visit the Freakonomics Radio show page on Apple Podcasts or go to Freakonomics.com. Plus, if you don't want to become a member, just do nothing. Everything will stay the same. We will be right back with How to Succeed at Failing. How did Amy Edmondson become so driven to study failure? Well, here's one path to it. Her whole life, she'd been a straight-A student.
SPEAKER_09: Right. I never had an A minus. Well, you know, I once had one in 10th grade. It just was so devastating, I resolved not to have one again. And I'm only partly joking.
SPEAKER_10: But then she went to college.
SPEAKER_09: I got an F on my first semester multivariable calculus exam. An F. Like, I failed the exam. I mean, that's unheard of.
SPEAKER_10: What'd that feel like?
SPEAKER_09: I didn't see it coming, but I wasn't baffled after the fact. After the fact, it was very clear to me that I hadn't studied enough.
SPEAKER_10: In the years since then, Edmondson has been refining what she calls a spectrum of causes of failure. The spectrum ranges from blameworthy to praiseworthy, and it contains six distinct categories of failure. Let's take two extremes.
SPEAKER_09: Let's say something goes wrong. We achieve an undesired result. On one end of the spectrum, it's sabotage. Someone literally tanked the process. They threw a wrench into the works. On the other end of the spectrum, we have a scientist or an engineer hypothesizing some new tweak that might solve a really important problem. And they try it and it fails. And of course, we praise the scientist and we punish the saboteur. But the gradations in between often lull us into a false sense that it's blameworthy all the way.
SPEAKER_10: OK, so let's start at the blameworthy end of the spectrum and move our way along. Number one of the six.
SPEAKER_09: My spectrum of causes of failures starts with sabotage or deviance.
SPEAKER_09: I soak a rag in lighter fluid, set it on fire, throw it into a building. Right. Or I'm a physician in a hospital, a surgeon, and I come to work drunk and do an operation.
SPEAKER_10: You describe this as the individual chooses to violate a prescribed process or practice. Now, I could imagine there are some cases where people violate because they think that the process is wrong.
SPEAKER_09: That's right. There has to be intent here to label something a true sabotage. It has to be my intent is to break something. It's not a mistake and it's not a thoughtful experiment. There certainly are protocols in hospitals, for example, where thoughtful physicians will deliberately depart from the protocol because their clinical judgment suggests that would be better. They may be right. They may be wrong, but that would not qualify as a blameworthy act.
SPEAKER_10: After sabotage on the spectrum comes inattention.
SPEAKER_09: Inattention is when something goes wrong because you just were mailing it in, you spaced out. You didn't hear what someone said and you didn't ask, and then you just tried to wing it. Or maybe you're a trucker and you're driving and you look away or fiddle with the radio and have a car crash.
SPEAKER_10: Now, it sounds like those are mostly blameworthy. But what about inattention caused by external factors?
SPEAKER_09: Well, that's exactly right. Once we leave sabotage and move to the right in the spectrum, it will never be immediately obvious whether something's blameworthy or not. It's always going to need further analysis. So when we say the failure was caused by someone not paying attention, that just brings up more questions. OK, why weren't they paying attention? Now, it could be that this poor nurse was on a double shift, and that is not necessarily the nurse's fault. It might be the nurse manager who assigned that double shift, or it might be the fact that someone else didn't show up, and so they have to just do it. And they're quite literally too tired to pay attention fully. So we always want to say, well, wait, let's see, what are the other contributing factors to this inattention?
SPEAKER_10: Can you think of a large-scale failure, a corporate or institutional failure, that was caused largely by inattention?
SPEAKER_09: Yes. One that comes to mind is a devastating collapse, with the loss of many lives, when walkways in a Hyatt Regency atrium collapsed in Kansas City in the early 80s. And the inattention there was the engineer of record's failure to pay close attention when the builder decided, out loud, not hidden, to swap one long beam for two smaller connected steel beams. It would have been a five-minute calculation to show that won't work with the loads that were expected. It was a change that didn't get the attention it needed to have avoided this catastrophic failure.
SPEAKER_10: And was that change done to save money or was it even more benign than that?
SPEAKER_09: I think it was a combination of speed and money. Speed is money.
SPEAKER_10: Wow, wow, wow, wow. That's a great example. OK, let's go to the next one. Inability. I'm reading one version of your spectrum here, which describes this as the individual lacks the knowledge, attitudes, skills or perceptions required to execute a task. That's quite a portfolio of potential failure.
SPEAKER_09: That's right. And that spans from a young child who doesn't yet know how to ride a bicycle, so as soon as they hop on that bicycle, they're going to fall off because they don't have the ability yet, to, you know, multivariable calculus, which, at least when you're not studying hard enough, you don't have the ability to do. So it's something that you just don't have the ability to do successfully, but usually could develop.
SPEAKER_10: This reminds me of the Peter principle, where people get promoted, based on their past performance, to a position beyond what they're capable of. But their past experience may not have been so relevant to the new role.
SPEAKER_09: That's a great connection. Yeah. The Peter principle where the failure gets caused by the fact that you don't have the ability to do the new role, but no one really paused to reflect on that.
SPEAKER_10: I sometimes think about this in the political realm, too. The ability to get elected and the ability to govern effectively seem to be almost uncorrelated to me, I'm sorry to say. Do you think that's the case? And do you apply this spectrum sometimes to the political realm?
SPEAKER_09: I don't think it was always the case, but I think it might be increasingly the case. There's no theoretical reason why the two abilities, being compelling and winning people over to your point of view, and actually being capable of governing, should be at odds. But the way it is increasingly set up in our society might be putting them at odds.
SPEAKER_10: After inability comes what Edmondson calls task challenge.
SPEAKER_09: Yes, the task is too challenging for reliable, failure-free performance.
SPEAKER_10: Example?
SPEAKER_09: A great example is an Olympic gymnast who is training all the time and is able to do some of the most challenging maneuvers, but will not do them 100 percent of the time. And so when that person experiences a failure, they trip during their routine, then we would call that a failure that was largely caused by the inherent challenge of the task.
SPEAKER_10: Can you give an example in either the corporate or maybe academic realm?
SPEAKER_09: Let's go to NASA, for example. The shuttle program is very, very challenging. I think we can all agree to that. And over time, they started to think of it as not challenging. But really, it's a remarkably challenging thing to send a rocket into space and bring it back safely.
SPEAKER_10: Kind of paradoxical, then, that the thing was actually called Challenger.
SPEAKER_09: That's a good point. Actually, I love Richard Feynman looking back on the Challenger accident, his sort of simple willingness to just put the piece of O-ring in the ice water and see what happens, right? That's something that, in a better-run, more psychologically safe, more creative, generative work environment, someone else would have done in real time.
SPEAKER_10: But, you know, if I recall correctly, even though he was on that commission to investigate, they tried to essentially shut him up. They didn't want that news coming out at the hearing. They didn't want the failure to be so explicit.
SPEAKER_09: That's right.
SPEAKER_10: But that's, I mean, that's not a good thing.
SPEAKER_09: That's not a good thing. You've got to learn from it so that it doesn't happen again.
SPEAKER_10: By the way, if you don't remember the story of Richard Feynman and the Challenger investigation and the O-rings, don't worry. We are working on a show about Feynman that you will hear in the coming months. OK, back to failure. The fifth cause of failure on Amy Edmondson's spectrum is uncertainty.
SPEAKER_09: So uncertainty is everywhere. There's probably, you know, an infinite number of examples here, but let me pick a silly one. A friend sets you up on a blind date, and you like the friend and you think, OK, sure. And then you go out on the date and it's a terrible bore, or worse. It's a failure. But you couldn't have known in advance. It was uncertain. How about a less silly example? You're in a company setting. You have an idea for a strategic shift or a product that you could launch. And there's very good reasons to believe this could work, but it's not 100 percent.
SPEAKER_10: The final cause of failure, and we have by now moved all the way from the blameworthy end of the spectrum to the praiseworthy end, is simply called experimentation.
SPEAKER_09: I'm being fairly formal when I say experimentation. The most obvious example is a scientist in a lab who probably really believes it will work, and puts the chemicals in, and lo and behold, it fails. Or, on a much smaller scale, I'm going to experiment with being more assertive in my next meeting, and it doesn't quite work out the way I'd hoped. It's the Edison quote, you know, 10,000 ways that didn't work. He's perfectly willing to share that because he's proud of each and every one of those 10,000 experiments.
SPEAKER_10: So that is Amy Edmondson's entire spectrum of the causes of failure: sabotage, inattention, inability, task challenge, uncertainty, and experimentation. If you're like me, as you hear each of the categories, you automatically try to match them up with specific failures of your own. If nothing else, you may find that thinking about failure on a spectrum from blameworthy to praiseworthy is more useful than the standard blaming and shaming. It may even make you less afraid of failure. That said, not everyone is a fan of Edmondson's ethos of embracing failure. A research article by Jeffrey Ray at the University of Maryland, Baltimore County is called Dispelling the Myth That Organizations Learn from Failure. He writes that failure shouldn't even be in a firm's vocabulary: to learn from failure, or otherwise, a firm must have an organizational learning capability, and if the firm has that learning capability in the first instance, why not apply it at the beginning of a project to prevent a failure, rather than waiting for a failure to occur and then reacting to it? But Amy Edmondson's failure spectrum has been winning admirers, including Gary Klein, the research psychologist best known as the pioneer of naturalistic decision making.
SPEAKER_07: I'm very impressed by it. I'm impressed because it's sophisticated. It's not simplistic. There's a variety of levels and a variety of reasons. And before we start making policies about what to do about failure, we need to look at things like her spectrum and identify what kind of a failure is it so that we can formulate a more effective strategy.
SPEAKER_10: OK, let's do that. After the break, two case studies of failure, one of them toward the blameworthy end of the spectrum.
SPEAKER_11: It was very much driven by the prime minister, Tony Blair.
SPEAKER_10: The other quite praiseworthy.
SPEAKER_12: I failed over 200 times before I finally got something to work.
SPEAKER_10: I'm Stephen Dubner. This is Freakonomics Radio. We'll be right back.
SPEAKER_10: John Van Reenen is a professor at the London School of Economics.
SPEAKER_10: He studies innovation. But years ago, he did some time in the British Civil Service.
SPEAKER_11: I spent a year of my life working in the Department of Health, when there was a big expansion of resources in the UK National Health Service and various attempts at reforms.
SPEAKER_11: The National Health Service is the UK's publicly funded health care system.
SPEAKER_10: One of the key things that was thought could really be a game changer was to have electronic patient records.
SPEAKER_11: So you can see the history of patients, their conditions, what they've been treated with. Instead of all these pieces of paper written illegibly by different physicians, having this in a single record would not only make it much easier to find what was going on with patients, but could also be used as a data source to help deliver more joined-up care, and maybe even predict what kind of conditions they might have in the future.
SPEAKER_10: The project was called Connecting for Health, and there was substantial enthusiasm for it. At least the ad campaign was enthusiastic.
SPEAKER_01: All this is a key element in the future of the NHS. One day, not too far away, you'll wonder how you lived without it.
SPEAKER_11: It was very much driven by the Prime Minister, Tony Blair. This was a centralized, top-down approach to create a single IT system where you could access information. Instead of having all these different IT systems, these different siloed pieces of paper, you'd have it all in one consistent national system. The NHS is a big operation, one of the biggest employers in the world.
SPEAKER_11: I think the ranking goes something like the US Department of Defense, the Indian Railway System, Walmart, and the NHS is up there in the top five. But if you drill down into it, it is pretty fragmented. Each local general practitioner unit is self-employed. Each trust has a lot of autonomy. And that's part of the issue: this was a centralized, top-down program in a system with a lot of different fiefdoms, a lot of different pockets of power, who were quite capable of resisting it and who disliked very strongly being told, this is what you're going to have, this is what you're going to do, without really being engaged and consulted properly.
SPEAKER_10: But the train rolled on despite these potential problems. Connecting for Health required a massive overhaul of hardware systems as well as software systems.
SPEAKER_11: The delivery was run by a man called Richard Granger, who was brought in as the highest paid public servant in the country. He was at Deloitte before he came, and after he left, he went to work for Accenture. He designed these very tough contracts, which loaded the risk of things going wrong very strongly onto the private-sector providers. I think just about every single quote-unquote winner eventually either went bankrupt or walked away from the contract. The estimates of the cost vary, but they run up to 20 billion dollars lost on this project. It was the biggest civilian IT project in the Western world. There's lots of examples of huge public-sector failures, and private-sector failures as well, but this was one of the biggest. British Parliament ultimately called this attempted reform, quote, one of the worst and most expensive contracting fiascos ever.
SPEAKER_10: So what kind of lessons can be learned from this failure?
SPEAKER_11: I think it's a failure of many, many different causes on many different levels. That top-down approach, not really understanding what was going on at a grassroots level, and the haste with which it was attempted.
SPEAKER_10: I've read that the haste, especially the haste of awarding contracts, was considered a great thing at the time, because it was so atypical of how government worked. It was hailed as, you know, a new way of the government doing business.
SPEAKER_10: In the end, that haste turned out to be a problem, though, correct?
SPEAKER_11: Correct. I mean, it seemed at the time these contracts were formed that the government was getting a good deal, and doing it quickly. They were loading the risks onto the suppliers. So it wasn't obvious from the get-go that this was going to be as bad as it turned out to be. Looking back, trying to do things quickly in such a complicated system, there was so much complexity that a lot of these contracts effectively had to be rewritten afterwards. And I think another general lesson is that when you're doing a long-term, important, big contract, you can't get everything written down quickly. There has to be a lot of give and take. It's a kind of relationship that you have to adjust as things go. Contracts are very fuzzy, very incomplete. You just have to accept that you're not going to get everything right, and not try to do everything really, really quickly. An IT project is never just about IT. It's also about the way you change a whole organization, and to do that, it's not just about spending money. You also have to get the players in that system on board, because it's very difficult to just order them to do things, especially in a public system where you can't simply fire people. You really have to have a culture of bringing people on board if you want to make these types of changes. And that just didn't happen. So I don't think it's just one thing. There's the haste, there's the design, which worked out badly, and there's the cultural aspects that we've talked about. When you're trying to innovate, you want to have a way of allowing people to take risks and get things wrong. But then you also have to have feedback mechanisms to figure out, well, what has gone wrong? So you create an attitude of saying, we actually don't know what the right thing to do is, so we're prepared to run experiments and learn from them.
SPEAKER_10: If you are the kind of person who likes to understand and analyze failure in order to mitigate future failures, what might be useful here is to overlay the National Health Service's IT fiasco onto Amy Edmondson's spectrum of causes of failure. Reconfiguring a huge IT system certainly qualifies as a task challenge. But there were shades of inability and inattention at work here as well. All of those causes reside toward the blameworthy end of the scale. As for the praiseworthy end of the spectrum, that's where experimentation can be found. The NHS project didn't incorporate much experimentation. It was more command-and-control, top-down, with little room for adjustment and little opportunity to learn from the small failures that experimentation can produce, and which can prevent big failures. Experimentation, if you think about it, is the foundation of just about all the learning we do as humans. And yet we seem to constantly forget this. Maybe that's because experimentation will inevitably produce a lot of failure. I mean, that's the point. And most of us just don't want to fail at all, even if it's in the service of long-term success. So let's see if we can't adjust our focus here. Let's talk about real experimentation. And for that, we will need not another social scientist like John Van Reenen or Amy Edmondson, as capable as they are, but an actual science scientist. Here is one of the most acclaimed scientists of the modern era.
SPEAKER_12: My name's Bob Langer and I'm an Institute professor at MIT. I do research, but I've also been involved in helping get companies started and I've done various advising to the government, FDA and places like that.
SPEAKER_10: And if I say to you, Bob, what kind of scientist are you exactly? How do you answer that question?
SPEAKER_12: Well, I would say I'm a chemical engineer or a biomedical engineer, but people have called me all kinds of things. You know, they've called me a biochemist. We do very interdisciplinary work, so I end up getting called more than one thing.
SPEAKER_10: Do you care what people call you?
SPEAKER_12: I just like them to call me Bob.
SPEAKER_10: Langer holds more than fourteen hundred patents, including those that are pending. He runs the world's largest biomedical engineering lab at MIT, and he is one of the world's most highly cited biotech researchers. He also played a role in the founding of dozens of biotech firms, including Moderna, which produced one of the most effective covid vaccines. One thing Langer is particularly known for is drug delivery, that is, developing and refining how a given drug is delivered and absorbed at the cellular level. A time-release drug, for instance, is the sort of thing we take for granted today, but it took a while to get there. One problem Langer worked on back in the 1970s was finding a drug delivery system that would prevent the abnormal growth of blood vessels. The chemical that inhibits the growth is quite large by biological standards, and there was consensus at the time that time release wouldn't work on large molecules. But as Langer once put it, I didn't know you couldn't do it because I hadn't read the literature. So he ran experiment after experiment after experiment before finally developing a recipe that worked. Decades later, thanks to all that failure, his discovery played a key role in how Moderna used messenger RNA to create its covid vaccine. So in your line of work, when I say the word failure, what comes to mind?
SPEAKER_12: Well, I mean, a lot of things, but I go back to my own career. I failed at trying to get research grants. My first nine research grants were turned down. I sent them to places like the National Institutes of Health, and they have study sections, reviewers. Mine would go, just because of the work I was doing, to what was called a pathology B study section. And they would review it, and they said, well, Dr. Langer, you know, he's an engineer. He doesn't know anything about biology or cancer. I failed over and over again. Other things, like I failed to get a job in a chemical engineering department as an assistant professor. Nobody would hire me. They actually said the opposite. They said, you know, chemical engineers don't do experimental biomedical engineering work, so they should work on oil or energy. When I first started working on creating these micro- and nanoparticles to try to get large molecules to be delivered, I failed over 200 times before I finally got something to work. I could go on and on about my failures.
SPEAKER_10: What kept you going during all this failure?
SPEAKER_12: I really believed that if we could do this, it would make a big difference in science, and I hoped a big difference in medicine. Secondly, as I did some of it, I could see some of these results with my own eyes. When we were trying to deliver some of these molecules to stop blood vessel growth, we were doing this double blind, but I could still see that we were stopping the vessels from growing. That's such a visual thing. And I also developed these ways of studying delivery out of the little particles, by putting certain enzymes in them and putting dyes in a little gel that would turn color if the enzymes came out. And I could see that happening. Like I said, the first 200 times, or first 200 designs or more, it didn't happen. But finally I came up with a way where I'd see it come out after an hour, after two hours, after a day, after a second day, up to over 100 days in some cases. So I could see with my own eyes this was working, and that made an enormous difference to me, too.
SPEAKER_10: But failing 200 times costs a lot of money and obviously a lot of time.
SPEAKER_10: Did you ever almost run out of one or the other?
SPEAKER_12: The experiments I was doing weren't that expensive, especially the delivery ones initially, because they were in test tubes. I worked probably 20-hour days. So the expense wasn't that great. And I've always been good at manufacturing time.
SPEAKER_10: Now, let's say someone is in a similar situation today to where you were then, with an idea or a set of ideas that they believe in, that they think they are right about. They think it's an important idea, and yet they are failing and failing to get the attention of the people who can help manufacture success. How do you think about the line? I think of it sometimes as a line between grit and quit, right? Economists talk about opportunity cost. Every hour you spend on something that isn't working is an hour you could spend on something that is working. But then psychologists talk about grittiness and how useful it can be to stick things out. Do you have anything to say to people who might be wrestling with that?
SPEAKER_12: Well, I think it's a great question, and I ultimately think it's a judgment call, and we can never be sure of our judgment. You like to try to think, is this thing scientifically possible? I think that's one thing. Secondly, it's good to get advice from people. That doesn't mean you have to take it, but it's good to get advice. I certainly personally have always erred on the side of, I guess, not quitting. And maybe that's sometimes a mistake. I don't think so. I think it depends on what could happen if you are successful. If you are successful, could it make a giant difference in the world? Could it help science a lot? Could it help patients' lives a lot? If you really feel that it can, you try that much harder. If it's incremental, sure, then it's much easier to quit.
SPEAKER_10: Is that ability to persevere within yourself, at least, do you think that's your natural temperament? Is that something you learned? Did you find incentives to lead you there?
SPEAKER_12: I think for me, there are a couple of things. One, I guess I've always been very stubborn. My parents told me that. But secondly, I think there's a whole thing with role models, too. When I was a postdoc, the man that I worked with, Judah Folkman, he experienced the same thing. He had this theory that if you could stop blood vessels, you could stop cancer, and that this was mediated by chemical signals. And everyone told him he was wrong. But I would watch him every day, and he believed anything was possible, and he kept sticking to it. And of course, eventually he was right. I think seeing his example probably also had a big effect on me.
SPEAKER_10: Can you talk to me about how scientific failure is treated generally? Let's assume a spectrum. On one end of the spectrum, every failure is written up and published, and perhaps even celebrated as having discovered a definitive wrong path, so everybody coming after you can cross it off their list. And on the other end of the spectrum, every failure is hidden away, which allows many other people to repeat the same failure. Can you talk about where the reality is?
SPEAKER_12: I think that's an interesting question. A lot of it even depends on how you define failure. When you're trying to learn about something, you try different things, and that's embedded in the scientific papers we write. Like when we wrote this paper in Nature in 1976, which was the first time you could get small particles to release large molecules from biocompatible materials. Well, some of the materials we used failed. A lot of them did, actually, because they would either cause inflammation, or the drug would come out way too fast, or not come out at all. We found one fraction that worked and stopped blood vessels, and probably 50 or 100 that didn't. So the failures and successes are maybe in the same papers sometimes. What I've tried to do, to give even more detail, is put all the data in, even if it makes for a very long thesis. So not only are the graphs in the papers, but there's even the raw data that people can look at and analyze. And I try to get people to do as much of that as possible. So I guess what I'm trying to say is that the failures and successes are almost intertwined.
SPEAKER_10: I'd like to hear you talk about how failure is discussed or thought of in the lab. Maybe it's nothing overt, but I am curious, especially when you bring in young people, researchers, whether they're postdocs or undergrads: do you give pep talks about failure? Do you have a philosophy that you want to instill in these people, that failure is an essential component of research and success?
SPEAKER_12: Yes, yes, I do, whether it's in my own talks or just meeting with students and brainstorming with them about those things. In scientific research, you just fail way more, at least I do, way more than you succeed. It's just part of the process. I mean, that's experimentation, and that's OK.
SPEAKER_10: A lot of your colleagues and students go on to start companies, and that's a whole different ball of wax. How do you think about failure in the entrepreneurial process?
SPEAKER_12: Obviously, the easy criterion is a successful company having a good financial exit, I suppose. But I don't necessarily think of it just that way. I mean, that's certainly going to be important. But I've been involved in things where you've advanced science and you've learned some things, and there's degrees of success. You just don't know. I've been pretty fortunate in the companies we've started in terms of the exits that they've had. But I just think there's no simple criterion. I feel like we've turned out a lot of great scientists and entrepreneurs, and not all their companies have had great financial exits. But they've also created products that can change people's lives, and that to me is also very, very important. That's why we do it in the first place. I have never done it for money, and I don't think they do it for money. They do it to try to make a difference in the world.
SPEAKER_10: Do you think failure is, however, a different animal in the research sphere than in the entrepreneurial sphere?
SPEAKER_12: I would say yes, I think it is. But I also think there's different cultures, too. I think the good thing about the culture of the United States, maybe in contrast to some cultures, is that failure is widely accepted. I'll give you one of my examples, actually, in the business sphere. So I'm a big fan of chocolate.
SPEAKER_10: Of eating it or making it or researching it?
SPEAKER_12: Probably any part, but mostly eating it. But at any rate, one of the books I read, and I'm actually not a fan of their chocolate, is a book on Milton Hershey. And this really gets to your point on failure. Milton Hershey had this idea, when he was very young, of starting a candy company. With the first candy company, he went bankrupt, and he tried to raise more money and started another one. I think the first six or seven totally failed, but not the last one, obviously. And he became a millionaire at a time when there weren't very many. Was that really failure, or was it just an apprenticeship in learning how to succeed? And I think that's true in a lot of things. The reason I brought it up is that I don't think there's shame in failure in either area, or I hope there's not. I think you have to feel it's OK, and then you keep going.
SPEAKER_10: What do you think? Would you like to live in a world where there's no shame in failure? Or do you think it's important for failure to hurt, to burn, as one of our guests put it last week? Maybe that creates a stronger incentive to succeed. I'd love to know your thoughts on this question and on this series so far. Send an email to radio at Freakonomics.com, or you can leave a review or rating in your podcast app. Coming up next time on the show, we will dig deeper into the idea of grit versus quit. When you're failing, how do you know if it's time to move on?
SPEAKER_06: We just could not stop it from leaking. And I was no longer willing to just keep pouring more and more of my money into it.
SPEAKER_00: He dumped me when I was 70, and I married him again at age 75.
SPEAKER_00: You know, hope springs eternal. This is a great idea.
SPEAKER_05: You just have to raise a quarter million dollars.
SPEAKER_10: Case studies in failure, and in grit versus quit, including stories from you, our listeners. That's next time on the show. Until then, take care of yourself, and if you can, someone else too. And remember to check out Freakonomics Radio Plus if you want even more Freakonomics Radio. Every week you will get a bonus episode of the show. This week, for instance, you will hear our full interview with the remarkable Bob Langer. To sign up, visit the Freakonomics Radio show page on Apple Podcasts or go to Freakonomics dot com slash plus. Freakonomics Radio is produced by Stitcher Media. You can find our entire archive on any podcast app or at Freakonomics.com, where we also publish transcripts and show notes. This episode was produced by Zach Lipinski and mixed by Eleanor Osborn with help from Jeremy Johnston. Our staff also includes Alina Cullman, Elsa Hernandez, Gabriel Roth, Greg Rippon, Jasmine Klinger, Julie Kanfer, Lyric Bowditch, Morgan Levy, Neil Carruth, Rebecca Lee Douglas, Ryan Kelly and Sara Lilly. Our theme song is Mr. Fortune by the Hitchhikers. The rest of our music is composed by Luis Guerra. As always, thank you for listening.
SPEAKER_10: The conversation that we had casually last year was a great conversation. If we can essentially do something similar, that'll be fantastic for our listeners.
SPEAKER_12: I'll try to remember what I said.
SPEAKER_05: The Freakonomics Radio Network, the hidden side of everything.
SPEAKER_02: Stitcher.
SPEAKER_10: Freakonomics Radio is sponsored by Flexport. Multiple logistics vendors, delayed shipments, extensive paperwork. It's enough to drive an entrepreneur mad. Flexport is here to help with a game changing self-service end to end AI driven supply chain solution for finance, freight and fulfillment. Flexport can automate your supply chain so you can spend more time growing your business. Visit flexport.com slash revolution to learn how Flexport can help you and get 90 days of their premium Flexport Plus service for free. Flexport ship anywhere, sell everywhere, grow faster.
SPEAKER_08: Amica is a different type of insurance company. We provide you with something more than auto, home or life insurance. It's empathy, because at Amica your coverage always comes with compassion. It's one of the reasons why 98 percent of our customers stay with us every year. Amica. Empathy is our best policy.
SPEAKER_13: Get ready for an unforgettable journey as Netflix unveils All the Light We Cannot See. Adapted from the Pulitzer Prize winning novel, All the Light We Cannot See is a breathtaking tale of hope, human connection and action. We follow the lives of Marie-Laure and Werner, who share a secret connection that will become a beacon of light that leads them through the harrowing backdrop of World War II. Directed by Shawn Levy with an exceptional cast, including Mark Ruffalo and Hugh Laurie. Watch All the Light We Cannot See, November 2nd. Only on Netflix.