Podcast Commentary

The Impact of Criminal Conviction in Health Care

John W. Harrington, MD
© 2023 HMP Global. All Rights Reserved.
Any views and opinions expressed are those of the author(s) and/or participants and do not necessarily reflect the views, policy, or position of Consultant360 or HMP Global, their employees, and affiliates. 

In this episode, John W. Harrington, MD, speaks about a former nurse who was found guilty in the accidental injection death of a 75-year-old patient. Dr Harrington discusses why her convictions for criminally negligent homicide and gross neglect of an impaired adult are controversial and how the precedent could complicate efforts to recruit and retain health care professionals.

Additional Resources:

  • Kohn LT, Corrigan JM, Donaldson MS, Institute of Medicine (US) Committee on Quality of Health Care in America. To err is human: building a safer health system. Washington (DC): National Academies Press (US); 2000. doi:10.17226/9728
  • Institute of Medicine (US) Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academies Press (US); 2001. doi:10.17226/10027
  • Rigamonti D, Rigamonti KH. Achieving and maintaining safety in healthcare requires unwavering institutional and individual commitments. Cureus. 2021;13(2):e13192. doi:10.7759/cureus.13192
  • Martin-Delgado J, Martínez-García A, Aranaz JM, Valencia-Martín JL, Mira JJ. How much of root cause analysis translates into improved patient safety: a systematic review. Med Princ Pract. 2020;29(6):524-531. doi:10.1159/000508677
  • Wolf ZR, Hughes RG. Best practices to decrease infusion-associated medication errors. J Infus Nurs. 2019;42(4):183-192. doi:10.1097/nan.0000000000000329
  • Quillivan RR, Burlison JD, Browne EK, Scott SD, Hoffman JM. Patient safety culture and the second victim phenomenon: connecting culture to staff distress in nurses. Jt Comm J Qual Patient Saf. 2016;42(8):377-386.
  • Lee SE, Dahinten VS. Psychological safety as a mediator of the relationship between inclusive leadership and nurse voice behaviors and error reporting. J Nurs Scholarsh. 2021;53(6):737-745. doi:10.1111/jnu.12689

John W. Harrington, MD, is the vice president of quality, safety, and clinical integration at Children’s Hospital of The King’s Daughters and the division director of General Academic Pediatrics (Norfolk, VA).


 

TRANSCRIPTION:

Jessica Bard: Hello, everyone. Welcome to another installment of Podcast 360, your go-to resource for medical news and clinical updates. I'm your moderator, Jessica Bard, with Consultant360 Specialty Network.

A former nurse was found guilty in an accidental injection death of a 75-year-old patient. Dr John Harrington is here to speak with us about the controversial conviction.

Dr Harrington is the vice president of quality, safety, and clinical integration at Children's Hospital of the King's Daughters and the division director of general academic pediatrics in Norfolk, Virginia.

Thank you for joining us today, Dr Harrington. Please give us a summary of the events and the legal case.

Dr John Harrington: Thank you, Jessica, for the chance to speak about this. I think this is an important topic. This was an interesting case that appeared on everybody's radar, especially in nursing.

It was related to a nurse named RaDonda Vaught, out in Tennessee. This happened in 2017, so about 5 years ago. She mistakenly gave vecuronium, which is a paralytic, instead of Versed, which is a sedative used to calm someone down.

She went into a medication cabinet at the MRI station where she was working, apparently took the wrong medication, and then gave it to this 75-year-old woman, who went into a paralytic state where she couldn't breathe. The thinking is that this is what caused her brain death and her demise.

What happened after that, though, wasn't a root-cause analysis process. It became a criminal process, and in 2019 she was arrested.

That case finally came to trial in Nashville, Tennessee. They finished the court case and she was convicted. She was convicted of criminally negligent homicide and could face 3 to 6 years in prison.

In the systems that we have, we try to protect against this, in the sense that if it's a mistake or an error and it's related to our systems, then the system has to be fixed. You can't just blame the person.

So one of the things that we look for is systemic issues or systemic failures that can cause human failure; human failure usually doesn't cause systemic failures.

In all of the systems that we use for quality and safety, our biggest tool is our event reporting and the ability to speak up when we think there's a problem, when we see a problem, or when a problem has occurred, so that we can find out whether it's something related to our systems and fix it, and so that we can improve patient safety and quality.

This is totally against what Sidney Dekker calls a just culture. A just culture says that everybody is human and humans can make mistakes, so we need to put systems in place that don't allow us to make a mistake. Things like, if you try to use a syringe for something but the syringe won't fit, you ask, "Why won't the syringe fit?" It's because you're not supposed to use that type of syringe, so it was designed not to fit.

Those kinds of design blocks prevent things that could create problems. So this conviction is controversial in the sense that it's going to make people not want to speak up. Someone might make a mistake and say, "Well, it didn't hurt anybody, so I'm not going to say anything, because I might get in trouble."

Or it did hurt somebody, but not so badly, and they don't want to get in trouble, so they don't speak up.

It will stop people from speaking up, so that even things that are tremors, which we could notice early, could now become earthquakes because we're not recognizing them. So we're really worried about this sending a shock wave through nursing and through everyone who does reporting.

We get reports from everybody: from the environmental services people, from the cafeteria staff, from anywhere patients come into contact with us. Frontline staff really need to speak up and let us know what's going on.

If there is water on the floor, that's a safety hazard for everybody, not just the patients, but for us. So, someone needs to say, “Hey, there's water on the floor. I'm going to put something over it," or, "I'm going to dry it up," or, "I'm going to call..."

If it's a really big thing, well, where did that water come from? Is there a leak upstairs?

There are all these different things that we do on a regular basis for safety purposes. So here, a person looking at vecuronium versus Versed saw the V, and maybe their vision was poor that day, or they were wearing goggles and a mask and their glasses fogged up. I mean, that's what's been happening to all of us during COVID.

You're thinking, that looked like Versed. Yeah, I think it was Versed.

What happens is that we do thousands of things correctly every day. But if we make 3 or 4 errors, those could hurt a patient.

So the idea is that, when we do have an error, we want to be able to find it, find the systemic problem behind it, and correct it so that it doesn't happen again.

That's really what this case is all about. It's basically saying this person made a mistake, and just because they made a mistake doesn't mean they should be criminalized for it. It wasn't intentional. Do you know what I mean? There was no intent to hurt the patient. It was a mix-up between 2 medications.

Yes, it's horrible. Yes, we all wish it didn't happen, but errors like this happen thousands of times, in some cases hundreds of thousands of times, every year across all hospitals when you add them up.

So if we criminalize it, then people won't speak up about these errors, and probably more of them will happen. That's the problem, and that's what we learned.

We learned long ago that if you punish people for these things, it just makes them hide errors more. It's not going to make people talk about them more. So I think that's why this conviction is really pretty controversial.

It's going to make it really hard for us now to say, "Hey, what's the next step here?" She's been convicted, and if she goes to jail and someone sees this as a precedent, they may want to say, "Hey, so-and-so got put..."

When I was working in New York, we had a situation where an oxygen canister went into an MRI area by accident. They thought it was one of the non-metal ones.

An MRI machine is a really strong magnet. If the canister is made of metal, the magnet pulls it in, directly to where the field is strongest, which is over the patient.

This happened, and it crushed the skull of the child and the child died. So obviously that was sort of like, “Oh my God.” Right? Everybody was like, “What are we going to do?”

Transparently speaking, you have to say, we made a big mistake. We brought a canister of oxygen in there that shouldn't have been there.

And then we went through the whole process. You do a root-cause analysis, and you say, well, that canister is the same color as the one that is non-metal, so maybe we should make them different colors. And shouldn't we have a metal detector at the entrance to the MRI area, so that it goes off if anything metal comes in?

All of these processes are now standard. Those are things we all do today, but they weren't standard before. Everybody just assumed, hey, don't bring anything metal in here and we'll be okay.

But now if you go into an MRI, it's like, they frisk you down, make sure you don't have anything metal on you and stuff like that. And then you have to come in with specific things.

So it's a really process-oriented thing now, because many people had been injured around MRIs prior to that.

So it's one of those things where we want to make sure we get those processes right for people.

Jessica Bard: Mm-hmm (affirmative). I think those are all good points, Dr Harrington.

How do this conviction and precedent complicate efforts to recruit and retain people in the health care professions? You mentioned in your last answer that it's sending shock waves through nursing. What did you mean by that?

Dr John Harrington: During COVID, everybody has been working in a stressed environment, right? Especially on the adult side, we had so many patients pass away. And we've all had to wear extra things, masks all the time, goggles all the time.

Lots of patients, patients being held for longer periods of time, staffing shortages, people working longer hours.

And you have your administration saying, "We're with you. We love ya. Keep working hard. We're going to get the staffing back up."

But now, if the message is, "If you make a mistake, you're in trouble," that can push people to say, "Listen, I was almost ready to get out of this anyway, so now I'm out. Now I can get in trouble if I make a mistake."

"I'm so tired sometimes, and I feel like I'm going to make a mistake every day now. So forget it, I'm not doing it anymore." They just throw their hands up and say, "I'm retired," or, "I'm going to do private duty nursing, where I just take care of one person and I can focus on one thing."

They leave nursing, or at least hospital nursing. So we've seen a huge exodus of nurses.

So everybody is having issues maintaining nursing staff, especially frontline workers and people working in high-stress environments, at the hospital bedside, in ICUs, and so on.

Or they're taking jobs that are more administrative. It's not necessarily that they left the system, but they moved out of the frontline system because they don't want to be exposed to that type of difficult environment, where you have to make decisions all the time.

You're working with things that could harm a patient. You're doing it in a very safe environment, but still, there are errors every day.

There are still problems every day. You might find that someone entered the wrong weight for a patient, so the patient gets the wrong dose of a medication. Someone says, "Hey, is the patient 13 kilograms or 13 pounds?"

Someone entered 13 kilograms when the weight was really 13 pounds, which is about 6 kilograms, so the dose comes out more than double what it should be. There are these types of things.

But we put that in the event reporter and we ask, shouldn't our computer have picked that up? Why didn't it? It turns out the weight was entered on the wrong screen or in the wrong field.
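As an aside for readers, here is a minimal sketch, in Python, of the kind of automated unit check Dr Harrington alludes to. The function name, conversion constant, and plausibility range are illustrative assumptions, not part of any real electronic health record system.

```python
# Hypothetical sketch of a weight-entry sanity check, not a real EHR feature.
LB_PER_KG = 2.20462

def weight_in_kg(value: float, unit: str, expected_kg_range: tuple[float, float]) -> float:
    """Convert an entered weight to kilograms and flag implausible entries."""
    if unit == "kg":
        kg = value
    elif unit == "lb":
        kg = value / LB_PER_KG
    else:
        raise ValueError(f"Unknown weight unit: {unit!r}")

    low, high = expected_kg_range
    if not (low <= kg <= high):
        # A real system would trigger a hard stop or pharmacist review here.
        raise ValueError(
            f"{value} {unit} = {kg:.1f} kg is outside the expected "
            f"{low}-{high} kg range; confirm the unit before dosing."
        )
    return kg

# The example from the episode: a 13 lb infant recorded as "13 kg" would more
# than double a weight-based dose, since 13 lb is only about 5.9 kg.
print(weight_in_kg(13, "lb", (4, 8)))   # passes: ~5.9 kg
# weight_in_kg(13, "kg", (4, 8))        # would be flagged for review
```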

These types of things happen all the time. But if you're going to take the fall for it, then you're like, "I'm giving up. I'm not going to do that."

That's why how this plays out is so important. If we basically have to say, "Listen, we can't protect you from criminalization," then we're going to have a hard time keeping people, because we know those accidents or errors are going to occur somewhere in the system. We don't want them to occur, but we know that they do.

With most event reporters, even if I look at our own, we get maybe 200, 300, 400 events per month that we report on, however many there are.

So we look at those, and they come from all different areas. It could be medication management, patient flow, operational issues, or even the instruments that we use: hey, this instrument is bent. Why is it bent? It won't work well, so we need to let somebody know what happened.

There are little issues that come up that we otherwise wouldn't know about. So the reason this would be hard is that people will say, "Hey, if I'm not protected and it's not really my fault," or, "I did make a mistake, but it was a human error." It was easy to confuse Versed and vecuronium; they both begin with V and they sound similar, so how else can we make them different?

Should they be in different colored vials? Should they be packaged differently?

It seems like they were trying to do those things, and they probably did fix them. But at the same time, people work fast when they're short-staffed and tired.

Sometimes people do workarounds and stuff, because they're like, well, we still have to get the work done. There are still 10 patients to see. We've got to figure this out.

"It always comes in the wrong vial. Just switch it to the other vial. It'll be fine. Don't worry about it."

You get these kinds of workarounds that people do because there's just not enough time in the day, right? So people make changes on their own.

We really don't want them to do that; we want them to report it to us. But if people think, "Well, if I report it and something bad comes of it, I'll be in trouble," that's going to be a problem.

And if a bad thing does happen and everybody says, "I don't want to say anything," that's going to be a problem as well.

We really need them to feel that everybody is behind them. There is shared accountability among the physician, the nurse, the hospital, and all of our systems, so that everybody shares accountability.

If someone sees me about to do something wrong, I want someone to speak up, "Hey, Dr. Harrington, you probably don't want to lean on the person's trach. It might pop out. You probably don't want to do this. Are you sure that's the right dose? Because the other kid is the same size and you gave half that dose. So, why are you giving double the dose to this patient?" "Oh, really? What? Oh my God, you're right."

So it's one of these things where you want people to be able to speak up, and to speak up across levels of hierarchy.

If a medical student says something to me, I'm not going to say, "Well, you're a medical student. What do you know?" I'm going to be like, "Wait a second. You might know what you're talking about. So, I should listen to you."

That whole idea of psychological safety is being able to speak up even if the person is much more senior to you. Those are the things we want to happen, not the things we want to discourage. So those are really important aspects.

Jessica Bard: From a clinician's perspective, what are more effective mechanisms for examining errors, establishing system improvements, and taking corrective action?

Dr John Harrington: A lot of times something happens, and so people assume what the answer is. Well, that wouldn't have happened if we did the following things.

But in reality, a lot of times you really need to know what the problem is. Some people call that a root-cause analysis.

You really need to interview people and kind of dig down and say, what is the real reason that this happened?

Say a patient got the wrong dose of something and you conclude, "Well, it's just because they didn't measure it correctly." But what if the pharmacy was delivered new vials diluted 1 to 100 instead of 1 to 1000 because we're having supply chain issues? That solution is 10 times more concentrated, so the volume you draw up has to be 10 times smaller.

They couldn't get the 1 to 1000 dilution, so they only had the 1 to 100, and now we have a more concentrated product on the shelf.

What happens is, the person at the bedside looks at the new order and thinks, "Well, that doesn't look right." But in reality it is right, because we have a new dilution, and that change didn't get passed down the chain. The nurses didn't know it. The doctors didn't know it.

So the reversal happens: you've got a much more concentrated solution, and you're saying, "Hey, listen, we usually give 5 mL. Why are we only giving 0.5? I think they're wrong." Then you draw up the 5 mL anyway, and now you've just given the patient 10 times the dose.
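To make the arithmetic in that scenario concrete, here is a minimal sketch, again in Python; the 1 mg/mL stock concentration and the function name are illustrative assumptions rather than actual drug values.

```python
# Hypothetical sketch of the dilution mix-up described above; the stock
# concentration of 1 mg/mL is an assumed value, for illustration only.

def dose_mg(volume_ml: float, dilution_ratio: float, stock_mg_per_ml: float) -> float:
    """Milligrams delivered when drawing a given volume of a diluted stock solution."""
    return volume_ml * stock_mg_per_ml * dilution_ratio

STOCK_MG_PER_ML = 1.0

intended = dose_mg(5.0, 1 / 1000, STOCK_MG_PER_ML)  # usual 1:1000 vial, 5 mL drawn
actual = dose_mg(5.0, 1 / 100, STOCK_MG_PER_ML)     # substituted 1:100 vial, same 5 mL

# The 1:100 solution is 10 times more concentrated, so the same 5 mL delivers
# 10 times the intended dose; drawing 0.5 mL would have matched the old dose.
print(f"intended {intended} mg, actually given {actual} mg ({actual / intended:.0f}x)")
```

Drawing 0.5 mL of the 1:100 vial would have delivered the intended dose; drawing the familiar 5 mL delivers 10 times that.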

So these are the types of things I'm talking about. It shows why you need a root-cause analysis to understand why this happened.

So you have to ask questions and find out: hey, the supply chain has problems. It wasn't that somebody grabbed the wrong syringe or the wrong dose; the pharmacy is using a whole new standard dilution, and that never got translated down to the front lines.

Now we know why it happened and we can fix the problem, versus just saying, "Well, they keep giving us the wrong dosage, so we're just going to keep correcting it case by case."

A lot of times, knowing how to move through that system and asking all the people involved in the process is the only way to find the actual reason for the problem and then figure out how to fix it.

A lot of people just see the end product and say, well, we should fire that person because they don't know what dosages they're giving. But then you never fix the problem in the pharmacy, and the next person comes along and hits the same problem.

They do the exact same thing as the person you just fired, or the person you just prosecuted and told, "You need to go to jail."

But it's actually that the pharmacy had a different dilution because of a supply chain problem related to the war in Ukraine. So you can see how it traces all the way back.

There's another approach, the 5 whys: you just keep asking why. Well, why did that happen? Because the dilution is different.

Well, why did that happen? Well, because there's a supply chain problem and we don't have the right dilution.

Well, why did that happen? Well, we have a supply chain problem because there are so many container ships sitting offshore that they can't deliver these products to us.

Well, why are there so many container ship problems? Well, there was that problem where the shipping lane by the Red Sea was blocked, like when we had those issues with all the ships backed up out there.

So there is a chain of reasons why, but the final common denominator is that we gave the wrong dilution, and the person on the front line who actually gave it didn't know what the dilution was.

So when you finally dig into the whole process, you find out why it happened. It wasn't the person you sent to jail at all; it was actually the supply chain issue. Now we can actually fix the problem instead of sending a person to jail.

That's really what this comes back to: was it really just one person's mistake or error, or was there something wrong that we could have fixed on the front end, or the back end, however you want to look at it?

Jessica Bard: What do this conviction and precedent mean for patients? Conversely, what would using the mechanisms that you just mentioned mean for patients?

Dr John Harrington: When you first look at it, it's sort of like, okay, we got rid of a bad actor. We got rid of a bad person who was dangerous to patients. Okay?

In reality, you've probably made it much less safe for patients, because you've switched the conversation to "people are the problem and we just need to get rid of the people who are the problem," versus "it's a system issue." And this happened 5 years ago, so a lot of those systems have probably, hopefully, been fixed.

But the point is that if you just get rid of the people, you're not getting rid of the problem, just as I was explaining. The problem may still exist, and getting rid of the people doesn't necessarily solve it. So it's likely not safer for patients to have this conviction.

In fact, it's probably much less safe for patients to have this type of conviction. It may feel like vindication, the sense that we did something and we fixed it, but you really didn't. You probably made it much worse.

So many times, when we try to use the law or the judicial process for these things, it doesn't work. It just makes things worse.

People won't speak up. You can look at the Challenger disaster: people weren't willing to speak up because they were told, "Don't say anything. Things are going fine. I don't care about the O-rings. Those are fine. Don't worry about those."

They were sort of like, "Well, we want to speak up, but we're being told that we shouldn't say anything."

If you read some of the literature on these events, it all comes back to a culture of safety and a just culture: you should be able to speak up without the fear of being criminalized, losing your job, or having something bad happen to you. Otherwise, bad things happen to other people. That's really the main issue here: you have to be able to speak freely about your work in order to maintain safety and quality in the things that you do. Otherwise, you're just going through the motions and things will get worse.

What's really hard about this is that the person is likely going to go to jail, and none of it is going to be helpful to anybody; if anything, it may make things worse for patients.

Jessica Bard: You mentioned a couple of anecdotes, but give us more of your personal experience. What are you hearing from your peers in the health care community about this conversation?

Dr John Harrington: On the nursing side, what we're hearing is basically, "I'm done. That's it. If this is really the way things are going to happen, I'm not interested in staying in this field."

From the other side, though, what we're hearing is that a lot of administrators and a lot of quality and safety people like myself are saying, "Listen, we've come so far in terms of trying to build a just culture and a safety culture. We've made a lot of headway. We don't want this to set us back 25 or 30 years. We want to keep moving forward."

So most of the chatter has been that our goal is not to punish the individual who makes a mistake; our goal is to analyze everything that contributed to the error and build systems so that it won't happen again.

For us to do that, we have to act as a full team. We can't let outside parties decide those actions for us, and we can't criminalize errors. If they're unintentional human errors, we can't criminalize that.

We can audit things and see whether people are making mistakes, and then we need to help them. If they're unintentional mistakes that stem from problems in our systems, then we need to fix those systems, because otherwise the mistakes are just going to keep happening. There's no way around that.

So the real chatter right now is saying, listen, we need to step up and say, "We're not going to stand for this. This is not something that we're going to let go quietly. We're not just going to say, okay, well, they can convict her and..."

I mean, most of the chatter is basically, we support you. We're with you. We're not going away. We're not going to try and hide from this.

We are going to go straight out there and say, "Listen, for safety, you need to be able to report things. You need to be able to report your errors. You need to feel safe that it's not going to land you in jail when what you did was unintentional and you were trying to follow a protocol."

Jessica Bard: Is there anything else that you'd like to add today, Dr. Harrington?

Dr John Harrington: Well, I think there are a lot of people who know a lot more about this than I do. The thing about it is that the passion a lot of people have for safety and quality is really important.

The idea of just culture and psychological safety is real. You really have to cultivate that. You can't just do that overnight. People have to feel like it's an important aspect, and it has to be woven into the fabric of your culture.

This case is almost like someone taking a razor and trying to cut that fabric in half. We're all going, whoa, that came out of nowhere. That really shouldn't have happened.

So now we really need to say, listen, that was an anomaly, and it should be treated as an anomaly. It should not become the norm.

We need to move forward and say, listen, a culture of safety is really important. A just culture is really important. A psychologically safe culture is important. If we cultivate all of those within our fields, we will see improvements in safety and quality.

So I think this case is an outlier. We're hopeful that it's not going to become common, but I do think that we need to speak with one voice.

I do think that if other communities start heading down this path, then we're going to need to band together, with the AAP and all of the different professional organizations, the nursing associations, the physicians, and our medical colleagues.

We're going to need to speak up as one voice and say, this is not going to be the norm.

Jessica Bard: Well, thank you so much for your time today. We really appreciate having you on the podcast.

Dr John Harrington: Yeah. Thanks, Jessica. Thanks for giving me a chance to speak out.