Girl Doc Survival Guide
Young doctors are increasingly in ‘survival’ mode.
Far from flourishing, the relentless pressure of working in medicine means that ‘balance’ is harder than ever to achieve.
On the Girl Doc Survival Guide, Yale professor and dermatologist Dr Christine J Ko sits down with doctors, psychologists and mental health experts to dig into the real challenges and rewards of life in medicine.
From dealing with daily stressors and burnout to designing a career that doesn’t sacrifice your personal life, this podcast is all about giving you the tools to not just survive...
But to be present in the journey.
EP140: When We Do Harm: A Discussion on Medical Mistakes with Dr. Danielle Ofri
In this episode, Dr. Danielle Ofri, a renowned physician and author with extensive experience at Bellevue Hospital and NYU, delves into the critical issue of medical mistakes discussed in her latest book, When We Do Harm. She provides insights into her journey of understanding the prevalence of medical errors, the emotional toll on healthcare providers, and the importance of balancing professional duties with emotional well-being. Dr. Ofri also shares personal experiences and advocates for systemic changes and honest communication to improve patient safety and care.
00:00 Introduction and Guest Introduction
01:00 Discussing 'When We Do Harm'
01:53 Understanding Medical Errors
03:50 The Human Element in Medicine
07:08 Personal Experiences with Medical Errors
11:35 Emotional Impact and Coping
17:43 Strategies for Improvement
28:29 Final Thoughts and Recommendations
Christine Ko: [00:00:00] Welcome to today's episode. I'm thrilled to have Dr. Danielle Ofri with us. Dr. Ofri is a physician and author who has spent much of her career at Bellevue Hospital, where she continues to practice and teach. She holds both an M.D. and a Ph.D. from NYU and is the co-founder and editor-in-chief of the Bellevue Literary Review. Dr. Ofri's writing, featured in the New York Times, The New Yorker, and The Lancet, has gained wide acclaim for exploring the emotional complexities of being a doctor. Her latest book, When We Do Harm, dives into the sensitive and critical issue of medical mistakes, advocating for better systems to minimize harm while recognizing the emotional toll these errors take on both doctors and patients. She's here today to share her insights on how doctors can balance their emotions and relationships while continuing to deliver excellent outcomes for patients. Danielle, how are you?
Danielle Ofri: Great. It's really great to be [00:01:00] here.
Christine Ko: So I'll start off with your latest book, When We Do Harm. How did you come to write that book?
Danielle Ofri: A couple of years ago, my editor at Beacon Press sent me an email with a Boston Globe article attached to it, which said that medical error is the third leading cause of death.
Is this really true? she asked me. And when you're the only physician in a publishing house full of English majors, you get everyone's medical questions. And so I'm thinking, you know, I don't actually know if that's true or not. Here I am, a primary care doc in a huge city hospital. And if medical error were indeed the third leading cause of death, I should be seeing lots of it, right? As much as I'm seeing heart disease and cancer, which are the first and second leading causes of death. And I wasn't. Or at least, I didn't feel like I was. So either it's there and we have complete blinders on, or the data are completely wrong. So the book came out of this effort to try and figure out: is that true or not?
Christine Ko: Is it true?
Danielle Ofri: So if I boil down the book, we can't really know, partly because simple things [00:02:00] like defining a medical error are harder than you think. When you give someone the wrong medication, was it a complete error? Was it a judgment call? Was it a dosing issue? A titration issue? Someone amputates the wrong leg. Okay. That's easy. That's an error, but rarely are things that clear cut. And even more difficult is, how do you know an error caused a death? Again, it sounds obvious, but most poor medical outcomes occur in patients who are sick. Sick patients have many illnesses, many medications, lots of people on the teams. And so figuring out what thing led to the death is very hard. Imagine a patient with end-stage liver disease given the wrong medication. And they die. Did the error cause the death? Patients with end-stage cirrhosis are dying anyway. And so we don't know if it caused the death. So figuring that out precisely is close to impossible. I think we'll never know. It's probably lower than number three, but it's not zero. It's not something small.
What I really learned from the book [00:03:00] is that the idea of error is a little less important than avoidable harm. If you're the patient and you end up on dialysis because of a poor medication choice, whether it was an error by the nurse or not, you're still having the bad outcome. From the patient's perspective, anything that harms them, an error or just a bad outcome, it doesn't matter. Still bad for the patient. So looking at medical error in the broader lens of avoidable harm is a more productive way to look at it. It gets pulled into this idea of patient safety.
Christine Ko: Maybe it's not number three in terms of cause of death for patients. I feel like, anecdotally, when I talk to people around me, it just seems like almost everyone has experienced some degree of harm in their encounters with the medical profession.
Danielle Ofri: That really underlines the idea that medicine is an inherently human endeavor, and we're often told, well, let's take a page from the airline industry, which has [00:04:00] these great checklists. Airplanes, complex as they are, come in a finite number of models. You can make a spreadsheet big enough to list every airplane model and every piece that's in there. And you can cross check that. It'll be a big list, but it is finite. With humans, it's not quite that way. People, their illnesses, present in multiple different ways, depending on their genetics, their physiology, their habits, their environment, socioeconomic class or culture, their language issues, economic issues; all kinds of things come into play. And you cannot make a checklist big enough. And then of course, there is the interpreter, the physician or nurse, who has their own history and biases, recent incidents that loom larger. And so there are many dimensions to human illness that we can't easily checklist. The flip side of a human endeavor is that there will be errors or poor outcomes. And what about all the things that happened where the patient wasn't harmed? Those near misses, that's the learning curve for the future. That's preventing the next [00:05:00] patient from being harmed, because my near miss today is an error tomorrow that's going to harm someone. And again, why do we not talk about near misses? Mainly because of emotions and shame and culture. And if doctors and nurses won't talk about the near misses, then we don't have good data. And so we're allocating our resources poorly, because we don't know where the potential harms are happening.
Christine Ko: Yes, exactly.
Danielle Ofri: One way to look at an error is: what was the environment such that the error was possible? One of my favorite examples is the tubes and hoses in the OR. The anesthesiologist has all these different gases running in, oxygen, nitrogen, anesthetics, and you don't want to mix those things up. And of course, occasionally some got mixed up, and patients would die. So then they color coded the different tubes, so it'd be easier, but there were still occasional errors. They finally changed the size of the nozzle and the hose so that you couldn't mix them up, right? So you can make an error less likely to happen. That's systems engineering. We have to be [00:06:00] accountable and honest and interrogate ourselves, but there's a systems issue that makes an error more or less likely. Sound-alike medications, look-alike packaging, patient names that sound alike, just lots of things.
So knowing that error is always going to happen to some degree, stopping yourself and catching it, that's error mitigation. And that's a more realistic path than getting rid of all error: we can learn to mitigate. An analogy I sometimes make: I'm a cello student. Maybe you can see my music stand in the background. I grew up playing piano, where an F sharp is always in the same spot because it's marked by the white and black keys, but on a stringed instrument there are no markings. And so finding the right note is really important. You're constantly trying to fix your tone. And someone interviewed Yo Yo Ma, and they said, how do you play so in tune? He said, I play as out of tune as anyone else, but I hear it faster and I fix it before the audience hears it. That's error mitigation, right? [00:07:00] We can make an error in our thought process or in what we're choosing. But if we can stop it or prevent it from causing harm, that's a worthwhile place to go.
Christine Ko: Would you be willing to talk about some difficult moments in your own career involving medical error or near misses?
Danielle Ofri: How much time you got? I think for people who are not in medicine, it is really terrifying to know that every doctor and nurse walking around has, I always think of it as like rosary beads. I'm not Catholic, but I imagine we all carry this thing in our pocket with each one of those errors, because they hit us in the soul. Everyone wants to do a good job. When we find ourselves falling short, we are traumatized and so ashamed of what we've done. We never forget. You can find the oldest emeritus professor, and they can tell you with a clarity that's frightening about the error they made when they were a medical student 60 years ago. It never leaves. And so when I start a new month on the wards with a new team, I always talk about my top five errors, just because we all make them. But one that [00:08:00] really stands out to me was when I was a PGY2, a second-year resident, and I had an intern. I think we were on the HIV service. It was just some busy night, no caps on admissions. And so the only way to gain control of your night was to try and turf your admissions to other services. If you got an admission, could you transfer that patient to surgery or OBGYN and get them off your list? And so we had one such patient admitted for altered mental status. I got the report: patient's totally fine, totally stable, labs fine, radiology fine; perfect patient to turf. At that time at Bellevue, maybe you recall, we had this back ward called IMCU, the intermediate care unit, and you could transfer a patient there that didn't have an active medical issue. Perfect IMCU case. So I called the IMCU doc: patient's totally stable. And I pressured this poor doc at five minutes to five to take this patient, which he did. My intern and I, we high fived each other. The next morning, I learned that my totally stable patient was actually hemorrhaging into their brain.
And that is why their mental status was [00:09:00] altered. But I missed it. And I missed it because I didn't look at the CAT scan myself. Someone said, radiology fine. And I took that. And of course, I knew I should have checked it. I just, I was busy and someone said it was fine. And so I just took that. Now the patient did okay because someone else saw the bleed, and called neurosurgery directly. The patient was whisked right from IMCU to the OR, and the patient did not suffer any harm. So in fact, it was a near miss that was captured by a system with layers of supervision and oversight. So in some ways it was a perfect outcome, but I had still made the error. I had neglected to do my due diligence as I knew I should have.
So a near miss just means the patient was lucky. They were lucky that circumstances allowed the error to be caught. But in another circumstance, I could have caused a patient's death. And even though it didn't come to that, I was mortified. I was so ashamed, because I knew that I had fallen short and that what I had done had put [00:10:00] my patient's life at risk.
And of course, I didn't tell anyone. I didn't tell my intern. I didn't tell my attending, and I sure as hell didn't tell the patient. I could not have imagined a sorrier fate than dragging myself to the patient's bedside, to their family, explaining how I almost killed the patient, but they were okay.
I didn't say anything, and that really reflects the atmosphere that we have: we would rather not say anything. And of course, besides the lost opportunity of preventing that kind of error in the future, it was a lost educational opportunity. Imagine. Think of all the things you've forgotten that your attendings told you, right?
You could fill volumes with the 20 kinds of vasculitis. But if an attending took you to the bedside and modeled how to talk to a family about a bad outcome or near miss, that's something we would remember. That's a lesson you can't teach on PowerPoint. But that was missed, and that error was never counted, right?
It remained forever hidden, uncounted, unstudied, [00:11:00] unremedied. So when we do our statistics on how to prevent patient harm, that error was completely missed. There are so many ramifications to the fact that we don't talk about our near misses. Obviously there's a fear of lawsuits, and that's a whole separate issue.
But superseding that by leaps and bounds is our own emotional state, our embarrassment, our shame, our humiliation of falling short. And that's why we don't talk because it's crushing because we're not the doctor we thought we were. Up until that moment, I thought I was pretty good. And after that I realized I was just a failure and I should really drop out as soon as possible.
Christine Ko: Yeah. When and how were you able to start talking about it?
Danielle Ofri: Oh, it took a good 20 years. I had to become a senior attending and be willing to write about it. That's how long it took. And the other ramification, which dawned on me later, is the emotional shock. I was just in a fog after that, my brain, my soul. So how many errors did I commit in the weeks that followed? I'm [00:12:00] sure I missed many things. So many other patients probably had less than adequate care because of that one thing. So it really redounds for so long, for so many years. It took that long because in movies and cable shows, we see the bad apple doctor. There are some. But I really think that's a very tiny minority. People who go into medicine, into health care, by and large go because they care about their patients. People genuinely want to help their patients. And so when we recognize that we have harmed them or done poor duty to them, we are crushed. And there are stories of suicides in the wake of errors. A NICU nurse made a tenfold calculation error with calcium chloride, which, as you know, would be fatal, and this was by all accounts a 25-year veteran nurse. In the year that followed, she took her own life.
Now we don't know what else was going on, but I think we can all identify with this complete obliteration of sense of self that we have failed or killed or [00:13:00] harmed someone in our care who is vulnerable. That's a tough one to recover from.
Christine Ko: It's telling that when I asked when you were able to talk about it, you said once you were a senior attending. Because once your identity as a senior attending is solid, then you can reflect and ask these questions. You feel stable in who you are, your identity, and that you are a good doctor, you do a good job; you're human and you make these errors, and everyone does.
Danielle Ofri: Isn't it a shame it took 20 years? Peter Pronovost, who many people call one of the fathers of patient safety, said that hierarchy kills. There's an untold number of patients who've died just because of hierarchy. The people lowest to the ground are the ones who probably see the things that are going on, the unsafe happenings, and do not feel comfortable speaking up because the system does not allow it. I think we've improved since then. I strongly believe that, but it's still hard when you're at the bottom to speak up.
Christine Ko: Yeah, absolutely. I think also you [00:14:00] were touching, in some of your earlier comments, on emotions and how shame, especially, plays a role. We're ashamed that we're not living up to our own, or some set, expectation of what a good doctor is like. Can you talk a little bit about that?
Danielle Ofri: It's so profound, and I think there's an important distinction between guilt and shame. Guilt is about the action that happened, and shame is about ourselves. Guilt prods us to make amends, but shame makes you want to hide. And those are two very different things. I think doctors in particular have difficulty distinguishing between guilt and shame. You know, being aware of that distinction requires someone to model it for you. And so when your senior resident or your attending or someone higher than you talks about how they experienced an error, what it felt like, it's really important to see that. It's really important to recognize that you can make an error, but you are not the [00:15:00] error. Those are two different things. Are there some people who are the error? Yeah. There's always a few people who deliberately lie and cheat. There are a few out there, and they should be prosecuted and gotten rid of, but for 99 percent of the people who make an error, it's either a slip of judgment, being overworked, or a genuine lack of knowledge. It wasn't a deliberate thing. Not malicious.
Those who are malevolent don't have the capacity for shame, as we see in our world today. But when you care, that is when you have a capacity to feel shame. And of course, why do doctors have the highest rates of suicide and drug use and substance use compared to other common professions? I think part of it is because we take our charge so seriously. We genuinely care. We take an oath, and it defines who you are. It's who you are, not just what you do. And so when you fall short, it's cataclysmic. It's like the mirror just shatters, and the person you thought you were is not the case any longer. Or so we think.
Christine Ko: Yeah. Yeah. I was talking to Pat Croskerry a long time ago. He's written a lot on [00:16:00] cognitive processing related to error and patient safety. And he related a story about a colleague who made an error in an emergency situation on a child, and the child ended up dying. So it was a serious medical error, and the physician took it really hard and attempted suicide. And Pat put it in an interesting way, but I think it actually goes along with what we've been talking about. He said, well, when we make an error like that, we should be hard on ourselves. But at the same time, we shouldn't be so hard on ourselves. When I sometimes talk about things I've done wrong, whether at home or at work, and I'm telling a family member or a friend or a confidant, a lot of times they're like, don't be so hard on yourself. You could say I'm being hard on myself, but I'm processing the appropriate guilt that I should feel at doing X, Y, or Z wrong, especially if it's very serious. But I don't actually feel shame. I'm not trying to hide from it, and I'm actually willing, I want, to talk [00:17:00] about it with you. I don't want it just dismissed with, oh, you're being so hard on yourself, stop being so hard on yourself, without thinking about the appropriate amount of guilt that I should use to be better or different next time.
Danielle Ofri: I wholly agree. And so I think being honest about our emotions, it's not so easy. We talk about, oh, just be open, be honest, but it's really not that easy. We have a lot of pride in our work, and reputation means a lot. Your word is really the only thing you have in this business; people trust that what you say, you mean, and that when you say you'll do X, you do it. We should be lauding people who come forward honestly. But you don't get the faculty of the year award because you came forward with your error.
Christine Ko: Yeah. So, in your research or just your own experience, do you have practical strategies that you think are useful for being able to be honest with your emotions? Because I agree, [00:18:00] I find it very difficult.
Danielle Ofri: It has to start from the top. So imagine how you'd feel if your department chair started the meeting with: I want to talk to you about the errors that I made, or wherever I fell short. Here's what happened. Here's how I felt. Here's how I handled it. Here's how I handled it poorly. Here's why I got a little better the next time. If nothing else, if the people at the top talk more honestly, you say, oh, that's what people at the top do. That's what chairs do. That's what medical directors do. So at least it says this is okay to do. Talking about where we fall short and how we can do better becomes the comfortable thing.
I have friends who trained in psychotherapy, and they always have these supervision sessions, and they go on for years and years. Even when you're out and an established clinician, people maintain supervisors or interprofessional groups where they meet, once a month or once every two months, to discuss difficult cases just on their own. There's no legal anything, no medical CME; they just do it as a matter of course.
We don't really have that. That kind of thing would help us, because we can [00:19:00] talk in a place where we're not being judged. It's not a conference. It's not a faculty meeting. It's trying to be honest with ourselves: did I make an error, or am I the error? How do I distinguish those things? And finding a trusted confidant, maybe they're not in medicine, or maybe they are, and they can help you see the difference between those things and think about the circumstances that made this error more likely. Even if, yes, it was me who pushed the wrong medication, were there things that could have made that easier? I was just printing off some letters. I was rushing, talking to my husband, and I printed off the wrong batch. I wrote the same set of letters twice. Not a bad outcome, but I wasted a lot of time. You can't actually multitask. You always make a mistake. Next time I'm not going to do that. It's that same kind of approach: where's the part I can have control over? Some things I can't control. But that's also a place to alert our institutions.
People often ask, how do you get change accomplished in an institution? Very hard. But I think there are two things that institutions care [00:20:00] a lot about. One is money and getting sued, and the other is patient opinions, right? Because patient satisfaction surveys now factor into money. And so they care what patients write about their experience. Using those two frameworks, I've actually filed patient safety complaints on near misses. If a patient had a bad experience because I was rushed in clinic, because they doubled the number of patients, something like that, I'll write it up: I didn't make a great error, but I really was at risk, because these 15 different mandates you threw into the middle of my visit made it a bad experience for the patient and put them at risk. That kind of thing. Or: here's where the newest rollout of Epic is now so much more complicated in this area that it's easy to mis-check a box. It didn't happen, but I almost did, and that could have been a bad outcome or a lawsuit. So patient safety and patient satisfaction are languages that our higher-ups think about. And I will often file patient safety complaints, because someone has to follow up on [00:21:00] that. It's annoying. It's hard for me to do. It's a pain in the ass for them, but it puts it on the record: I almost clicked the wrong box here because of this terrible interface.
Christine Ko: That's very smart. That's really good advice. Do you have any other advice for younger doctors who are starting their careers, on maintaining emotional balance and resilience in a culture that I think still demands perfection?
Danielle Ofri: Every time I hear the term wellness, I get a little bit nauseous, you know, because it implies that you just need to do a little better job. So whenever they say, oh, you know, we're going to have a Pilates session... I don't want Pilates. I want an assistant. That is what I need. Don't tell me to take yoga. Yoga is great. Pilates is great, but give me enough time to do my abnormal labs, or give me an assistant, something like that. The practical things. So the wellness and resilience thing, yes, you want to be resilient, but it puts the onus on us to make ourselves work [00:22:00] in this terrible system. Which is why I do advocate for voicing those complaints upstairs. And don't be afraid to speak up when the 40-minute slots for new patients get cut to 20 minutes. That's problematic.
Independent of that, I do think that having our own hobbies and passions is really important, particularly when you hit mid career, once you're out in practice, 10 years in. I did not recognize the source of my feeling a little out of sorts till I ended up taking cello lessons. When I suddenly was in that learning mode again, where I was with a teacher and being pushed to climb harder and harder, I realized I was missing that in medicine. Even though internship and residency and medical school were tough times, they were also exhilarating times. They're the opposite side of the same coin: they were hard and crushing, but you were also just learning, and your mind was expanding, and that's just an incredible experience.
Having a passion where you're learning and being [00:23:00] pushed to move and grow is important, because I do think it opens our brains up to more sophisticated ways of thinking. And I think a lot of our medical errors, if I can circle back to that, come from the fact that we often think in these concrete lines. Okay, this is how we diagnose X, badoom, and we do that, and it's easy to make mistakes when we do that. But forcing yourself to learn something else gives you more neurological resiliency.
What's a metaphor in poetry? A metaphor in poetry is taking two unrelated things and finding a connection. That is a process in our brains. If you try a new kind of exercise, there are muscles you haven't used before, and they gotta get worked out. It's the same with forcing your brain to think across disciplines and in different lanes. Humanities or music or anything else that we learn, it's not just good for ourselves. It's actually good for our thinking process, because it allows you to think in these more subtle, sophisticated ways, and it also helps you work with ambiguity. We don't talk much about uncertainty and ambiguity in medicine because we don't like them very much. We want, oh, let's have a nice [00:24:00] little algorithm. But so much of medicine and patients is ambiguous, because they're human and there are emotions involved. And so tolerating ambiguity and functioning within an ambiguous situation, which is often what we have to do, is profoundly unsettling. But the things that specialize in ambiguity, I'm thinking humanities, literature, music, all these complex art forms, that's what they revel in. So it gets you more accustomed to that, and I think it helps us be better in the ambiguous situations that medicine brings up.
Christine Ko: Yeah, I agree: learning something new, as you did with cello, does give you an agility and flexibility. Because, to circle back to what you were saying earlier about checklists and aviation, every patient and every doctor are not going to be the same on any given day. So even though we might set up these algorithms or checklists for how to approach a certain problem, the same approach isn't always going to be the right approach. So I think you're right that having a way to [00:25:00] link across things, and maybe come to a different solution for a particular patient, is important.
Danielle Ofri: And there's even a step beyond being a better diagnostician with more medical knowledge. We often talk about smart doctors, and, listen, everyone's smart, but wise doctors are different. You learn that difference as a patient. When you get very sick, you can have smart doctors make these diagnoses, but then what do you do about them? How do you approach them? That's where wisdom comes into play. And if you've ever had to navigate the serious illness of a family member, you really see the difference it makes when you have doctors and nurses with wisdom. That wisdom is just very different from being smart, because you can make the diagnosis, but how do you navigate a treatment plan, how do you think about mortality, how do you weigh risks and benefits? Because weighing risks and benefits is not a Bayesian theorem. There are the patient's feelings and what matters to them, and sometimes they don't know how to weigh those. [00:26:00] That's really complicated, and we're not taught much of that in medical school, and you really can't get it from UpToDate or Harrison's or any of the great things we rely on. That is the wisdom of experience. Immersing yourself in other places that test your moral flexibility, sophistication, implications, shades of gray, that is really what being sick is all about. And I think we often fall short on that. So for that reason I recommend particularly the humanities. Would playing rugby also give you some great outlet? Sure. But I don't think it's quite the same.
Christine Ko: Yeah.
Danielle Ofri: I can give my plug for the Bellevue Literary Review: read great literature, and this is short fiction, nonfiction, and poetry about health and illness and healing, in ways that explore lots of these issues beyond what we've experienced as clinicians. It gives you ways to think. And I've had many doctors say that reading some of these stories or poems gives you insight into other aspects of the same illnesses I deal with every single day. You learn about patients' experiences, families, clinicians, [00:27:00] other grapplings with that. And that can only add to our repertoire of things to lean on when we're faced with a patient in a complicated, ambiguous, difficult situation.
Christine Ko: Yeah, that's great. Yes. Okay. So everyone should try to subscribe to the Bellevue Literary Review. I actually will do it after this; I'm not a subscriber yet. I just want to circle back to something you said before, which I wanted to ask you about: did I make an error, or am I the error?
You said that it's important to distinguish between those two things. Can you elaborate on that? I'm not actually quite sure what you mean.
Danielle Ofri: When something bad happens, and I'll take the example of my missing the brain bleed: I made an error. If I had completely internalized that I was the error, then I should simply walk out the door, because I am a bad physician, unsafe for patients, and I should leave now. And I think that is rarely the case, right? Again, that's the easiest conclusion to draw when you look around and everyone else is doing a great job. It's just [00:28:00] me that was so stupid and so incompetent that I almost killed this patient. And we feel very lonely in that; we must be the error, because everyone else is doing such a great job. Of course, if you talk to everyone, you'll find that they're all nursing those same things: I've made those mistakes. So recognizing that nearly all of the time we are not the error, but we have made an error. Or we were in a situation that made this error more possible, or some of that.
Christine Ko: Absolutely. It's been awesome to talk to you. Do you have any final thoughts?
Danielle Ofri: Oh, I hope people recognize, especially if you're outside the medical profession, that medicine is very complex. We know that, obviously, but it's even more complex than that. There are so many layers beyond learning. There are 10,000 diseases out there. That's already complicated. But you can cross match that with individual patient circumstances, emotions, economics of the time; so many things come into play. And so I think patients often have the expectation of perfection. [00:29:00] And of course we want to do the best job ever, but recognize that it is a human endeavor. Patients can be understanding of that if you can explain to them why there's complexity here, why we can't know for sure what the outcome will be. Helping people manage their expectations can make it easier to understand why things can go wrong, both for patients and for doctors.
And then, read a good book, read some poetry. It'll always do you well.
Christine Ko: Read the Bellevue Literary Review. Thank you so much.
Danielle Ofri: You're very welcome.