Why Teams Don't Learn From Their Mistakes (And How to Change That)

In 2002, Dr. Gary S. Kaplan, the recently appointed chief executive of the Virginia Mason Health System in Seattle, visited Japan with some fellow executives. He wanted to see how organizations outside the health care sector did things.

At a Toyota plant, Kaplan had a revelation. In the automaker's well-known production system, if anyone on the production line sees an error, they send an alert that halts production across the plant. Senior executives rush over to see what's gone wrong and address the issue, and afterward the process is adapted to prevent it from happening again.

"The system was about cars, which are very different from people," Kaplan says when we meet for an interview. "But the underlying principle is transferable. If a culture is open and honest about mistakes, the entire system can learn from them." When Kaplan took this lesson back to the Virginia Mason Medical Center in Seattle, the results were astounding.


Kaplan has bright eyes and a restless curiosity. As he talks, his hands move animatedly. "We knew that medical errors cost thousands of lives across America," he recalls, "and we were determined to reduce them."

One of Kaplan's key reforms was to encourage staff to make a report whenever they spotted an error that could harm patients. It was almost identical to the reporting system in aviation and at Toyota. He instituted a 24-hour hotline as well as an online reporting system. He called them "Patient Safety Alerts."

The new system represented a huge cultural shift for staff. Mistakes were frowned upon at Virginia Mason, just like elsewhere in health care. And because of the steep hierarchy, nurses and junior doctors were fearful of reporting senior colleagues—so most didn't, and Kaplan's system went largely unused.

As Cathie Furman, who served as senior vice president for Quality, Safety, and Compliance at Virginia Mason for 14 years, later told The Telegraph, "In health care around the world, the culture has been one of blame and hierarchy. It [can prove] very difficult to overcome that."


In November 2004, two years after Kaplan's trip to Japan, a terrible event changed everything. Mary McClinton, a 69-year-old mother of four, was inadvertently injected with a toxic antiseptic called chlorhexidine, instead of a harmless marker dye, during a brain aneurysm operation. The two substances had been placed side by side in identical stainless-steel containers, and the syringe had drawn from the wrong one. One of her legs was amputated, and she died from multiple organ failure 19 days later.

Rather than evading responsibility, Dr. Kaplan published a full and frank apology. "We just can’t say how appalled we are at ourselves," it read. "You can’t understand something you hide." McClinton's relatives welcomed the apology. It helped them understand what had happened to a beloved family member.

But the death provided something else, too: a wake‑up call for the 5,500 staff members at the medical center. "It was a tough time, but the death was like a rallying cry," Kaplan says. "It gave us the cultural push we needed to recognize how serious an issue this is."

Suddenly, Patient Safety Alerts started to fly in. Those who reported mistakes were surprised to learn that, except in situations where they'd been reckless, they were praised, not punished. Dr. Henry Otero, an oncologist, made a report after being told by a colleague that he'd failed to spot a patient's low magnesium level. "I missed it," he recounted in the same Telegraph article. "I didn’t know how I missed it. But I realized it’s not about me, it’s about the patient. The process needs to stop me making a mistake. I need to be able to say, ‘I might be the reason, fix me.’"


Today, there are around a thousand Patient Safety Alerts issued each month at Virginia Mason. A report by the U.S. Department of Health found that these have uncovered latent errors in everything from prescription to care. "After a pharmacist and nurse misinterpreted an illegible pharmacy order, leading to patient harm, the medical center developed a step‑by‑step protocol that eliminates the likelihood of such incidents occurring," the report said.

Another alert warned about wristbands: "After a newly admitted patient received a color-coded wristband signifying ‘Do Not Resuscitate’ instead of one indicating drug allergies (as a result of a nurse being color blind), the medical center added text to the wristbands."

In 2002, when Kaplan became CEO, Virginia Mason was already a competent Washington hospital. By 2013, however, it was rated as one of the safest hospitals in the world. That same year, it won the Distinguished Hospital Award for Clinical Excellence and the Outstanding Patient Experience Award, and was named a Top Hospital by the influential Leapfrog Group for the eighth successive year. Since adopting the new approach, the hospital has seen a 74% reduction in liability insurance premiums.


This success is not a one-off or a fluke; it is a method. Properly instituted learning cultures have transformed the performance of hospitals around the world. Claims and lawsuits made against the University of Michigan Health System, for example, dropped from 262 in August 2001 to 83 in 2007 following the introduction of an open disclosure policy. The number of malpractice claims against the University of Illinois Medical Center fell by half in two years after the creation of an open-reporting system.

The example of the Virginia Mason system reveals a crucial truth: learning from mistakes has two components. The first is a system. Errors can be thought of as the gap between what we hoped would happen and what actually did happen. Cutting-edge organizations are always seeking to close this gap, but in order to do so they need a system geared to taking advantage of these learning opportunities.

This system may itself change over time: many experts are already trialing methods that they hope will surpass the Toyota Production System. But every such system shares a basic structure at its heart: mechanisms that guide learning and self-correction. Yet an enlightened system on its own is sometimes not enough.

Even the most beautifully constructed system won't work if professionals don't share the information it needs to flourish. In the beginning at Virginia Mason, the staff didn't file Patient Safety Alerts. They were so fearful of blame and reputational damage that they kept the information to themselves. That's perhaps the most important lesson of all: Mechanisms designed to learn from mistakes are mostly impotent if people can't first admit to the errors they make.

This post originally appeared on Fast Company