Anyone who makes critical decisions needs to be aware of dual process theory and the heuristics we use and are subject to. Dr. Pat Croskerry, Dr. Christopher Chabris, and Dr. Itiel Dror are all experts on critical thinking and how our minds work. This episode is also available for Yale CME credit (1.0 hour). Dr. Pat Croskerry, MD PhD, is a professor of emergency medicine at Dalhousie University in Halifax, Nova Scotia, Canada. For the past 10 years, he has been Director of the Critical Thinking Program, Division of Medical Education, Faculty of Medicine, at that same institution. Trained as an experimental psychologist, Dr. Croskerry went on to become an emergency medicine physician and found himself surprised by the relatively scant amount of attention given to cognitive errors. He is one of the world's foremost experts in safety in emergency medicine and in diagnostic errors. He is humble, honest, and thoughtful; read this interview for more insight into his background and work in the emergency department (https://psnet.ahrq.gov/perspective/conversation-withpat-croskerry-md-phd). Other recent key links to his work include https://www.bmj.com/content/376/bmj-2021-068044/rr-1 and The Cognitive Autopsy (https://www.amazon.com/Cognitive-Autopsy-Analysis-Medical-Decision/dp/0190088745/ref=sr_1_1?crid=2UQIRFBZTX6JH&keywords=croskerry+cognitive+autopsy&qid=1648025342&sprefix=%2Caps%2C109&sr=8-1). Dr. Christopher Chabris, PhD, is one of the originators of a famous psychology experiment (link: https://www.youtube.com/watch?v=vJG698U2Mvo); he is currently Professor and Co-director of the Behavior and Decision Sciences Program at Geisinger Health System. His book with Daniel Simons, The Invisible Gorilla (link: http://www.theinvisiblegorilla.com/), is a bestseller that goes into much greater depth on the everyday illusions of attention, memory, confidence, knowledge, cause, and potential, as well as the myth of intuition.
Dr. Itiel Dror, PhD, is a senior cognitive neuroscience researcher at University College London. He received his PhD in Psychology at Harvard University. He researches the information processing involved in perception, judgment, and decision-making. Dr. Dror has dozens of research publications, which have been cited over 10,000 times. His theoretical academic work is complemented by applied work in healthcare, policing, and aviation to improve human training and decision making. More information and publications are available here. Links to some papers: 1) Short piece from Science, 2) A bit more 'meat' explaining bias sources & fallacies, 3) A 'solution' too, and 4) 'Hot off the press', just published, a new paper on forensic pathology decisions.
[00:00:00] Christine Ko: Welcome back to See, Hear, Feel. I'm very excited to be speaking to Dr. Pat Croskerry. Dr. Pat Croskerry has an MD and a PhD degree. He is a Professor in Emergency Medicine at Dalhousie University in Halifax, Nova Scotia, Canada. For the past 10 years, he has been Director of the Critical Thinking Program in the Division of Medical Education at the Faculty of Medicine at that same institution. He is trained as an experimental psychologist and is also an emergency medicine physician who found himself surprised by the relatively scant amount of attention given to cognitive errors in medicine. He is one of the world's foremost experts in safety in emergency medicine and in diagnostic errors. There is a link in the show notes to an interview with him for more insight into his background and work in the emergency department. There are also other key links to his work, including a British Medical Journal article as well as a recent book of his titled The Cognitive Autopsy. I highly recommend reading that book if you're interested in learning more about how our minds work, specifically in medicine. There is even a dermatology case, which was exciting for me. Welcome, Dr. Croskerry.
[00:01:12] Pat Croskerry: Thank you, Christine. Pleasure to be here.
[00:01:14] Christine Ko: Can you tell me something about yourself that your biography doesn't illuminate?
[00:01:19] Pat Croskerry: I'm married, and I have one son who's actually a medical student. I feel as though he's allowing me to stay in touch with modern medicine. I'm pretty much retired from emergency medicine now.
[00:01:31] Christine Ko: You wrote recently in a BMJ editorial, and as I said, the link is in the show notes, that a curriculum in clinical reasoning should be taught in all phases of clinical education. Can you speak about how metacognition or dual process theory and emotional intelligence underpin that curriculum?
[00:01:49] Pat Croskerry: If you talk to psychologists about affect, which may be positive, negative, or neutral, they say that we're usually placed somewhere on an emotional spectrum: you may slightly like somebody, just on the basis of their appearance or what they're saying, or you may slightly dislike them, or you may be polarized even further along that continuum.
[00:02:14] Psychologists call that the affect heuristic, that actually we tend to make decisions based on our emotions, which are largely unconscious to us. If you think about that, that's a pretty daunting revelation. So the problem arises that as an emergency physician, I may approach a patient, and they may trigger a negative affect.
[00:02:38] And without being consciously aware of it, my decision making may change in a biased way for no good reason. The second problem they mention is that your first reaction to a patient is not only influenced by this affect heuristic, but it also tends to influence what follows, because it initiates your decision making process. This has been discussed in the psychology literature as a cascade effect: the first bias tends to trigger more bias. It's also been referred to as a snowball effect. Affect is critically important; it underpins a lot of decisions.
[00:03:23] Medicine has to undergo a similar kind of epiphany to the one it went through with statistics. The profession recognized the essential need for proper statistics, and what we're asking for now is that it recognize the essential need for an appreciation of, and training in, cognition in medicine. I want to make that parallel between how statistics emerged in experimental medicine, which was absolutely essential, and what I'm arguing for now: the emergence of cognitive training in medicine as part of our standing curriculum.
[00:04:01] Christine Ko: What you just described makes a lot of sense to me, because when I was a medical student, I definitely had a statistics course. But I would say I have never taken a course, ever actually, in decision making. What are some of the key concepts in cognitive psychology, ones that cognitive psychologists take for granted, that you would want to see in such a curriculum?
[00:04:26] Pat Croskerry: In medicine, the biases we want to talk about are called JDM biases, for judgment and decision making. These are normal characteristics of human cognition that you will tend to exhibit: anchoring, availability, confirmation bias, and all the others. There are lots of them. We want to get across very much the idea that these JDM biases, judgment and decision making biases, are, first of all, universal.
Everybody demonstrates them, whether you're a doctor, or a lawyer, or a builder, or a politician. We all do it. We're not necessarily aware of it. That's the second major problem: a lot of these biases, judgment and decision making biases, occur subconsciously, and they influence fairly significantly what we do.
And most people don't have insight into how they actually happen. A lot of us are being manipulated by these biases, subconsciously or unconsciously. I think most people aren't aware of it. People have even argued in medicine that doctors aren't vulnerable to these kinds of biases.
And I find it shocking that we should be considered different in this fundamental characteristic of human behavior. Why should doctors be any different? There's nothing in our training that actually trains us to be objective, debiased, and aware of biases; quite the opposite. There's an absence of training.
[00:06:02] Christine Ko: What you just covered is that a curriculum would address JDM bias, so judgment and decision making bias, and cover some of the most important biases that doctors especially are affected by. Would it also be important for the curriculum to address metacognition and System 1 or Type 1 and System 2 or Type 2 thinking?
[00:06:22] Pat Croskerry: We train medical students here at Dalhousie in decision making, and we start with dual process theory. We say, first of all, we're going to tell you that you don't know anything about decision making unless you can tell me what dual process theory is. There have been lots of theories about how people make decisions. But dual process theory is probably the most powerful. It's the most coherent and the most widespread. People now know about it. You hear politicians refer to it, lawyers, people in business, and so on. It's become a very well founded and grounded model for how we make decisions. We first of all say, you need to know about this.
[00:07:03] You need to know that there are two major processes by which we can arrive at decisions. You can start to ask, how does System 2, which is conscious, deliberate, and analytical, monitor System 1, which is unconscious, impulsive, and something that people aren't particularly aware of? So, we can now bring in the concepts of mindfulness and metacognition, which is just thinking about how you're thinking, understanding that we may make decisions in different ways.
[00:07:37] You can start to say, well then, what are our metacognitive processes? And they are things like reflection, which is trying to pull back from the immediate situation and focus on what you're thinking and what sort of impact the patient is having on you, and mindfulness, just being aware of what impact you have in this decision making with the patient. The model describes how that metacognitive step occurs, from System 2 to System 1.
[00:08:08] Christine Ko: In The Cognitive Autopsy, you describe four major categories of intuitive thinking. The System 1 thinking you just mentioned, that fast, hardly thinking, almost like a gut reaction type of thing. Can you break those down a little bit?
[00:08:21] Pat Croskerry: First of all, just for the record, we don't use the term thinking. I know that Daniel Kahneman does, but we don't actually say thinking in System 1; we say decision making. Because if you look up any definition of thinking, it implies some sort of deliberate act. If I say to you, what were you thinking? You immediately start to think, what was my brain actually doing while I was thinking? And you don't usually say, well, I guess it was just an impulse. Because impulses and rapid responses from System 1 don't necessarily involve any thinking. Often no more thinking is attached than there is in a spinal reflex. You hit them with some sort of association, and they make a response. To our minds, that's hardly thinking. That is decision making, but not necessarily thinking. So, for System 1, Keith Stanovich, Maggie Toplak, and Richard West have worked extensively on dual process theory and how System 1 gets established.
[00:09:22] They actually translate that pretty much into rationality and say the more people are good at resisting these impulses and decisions that are made in System 1, the more likely this person is to be rational. That's their first claim, and I think it's a good one. They say that the origins of decision making in System 1 come from four sources.
[00:09:46] The first one is, we're born with it. It's in our DNA. So if you go back to our prehistoric past way back, certain patterns in our behavior were established that facilitated our survival. Say for example, a famous bias in medicine, one that's common in radiology, is called search satisficing, where once you find something, you tend to call off the search for anything further.
[00:10:12] So they say, well, our ancestors from 200,000 years ago, if they were looking for food, or they were looking for shelter, or they were looking for a mate, somebody to have children with, they would probably stop as soon as certain criteria were satisfied. That is translated into a modern version of search satisficing, that as soon as you find something, you tend to call off the search. So that's the first category of biases. These are actually in our DNA.
[00:10:42] The second category is associated with emotion, and we've talked a bit about this already. That emotional tagging of biases can certainly, as we demonstrated earlier, can certainly influence the decisions that you make.
[00:10:55] The third category is called explicit learning, which is when I teach you certain ways to respond to certain things, and later on, when you're in that situation, you make the predictable response. It happens in medicine too. You've explicitly learned certain behaviors that are acceptable and certain that are not.
[00:11:13] And then the fourth category is called implicit learning, where this is a little bit more difficult to get across. But if you talk to, say, a coach of rowing, they say things like, time spent in the boat actually teaches you about how the boat works. Even though nobody else may be there to point out to you what's going on, you develop a kind of muscle memory or cognitive memory for what the boat's doing that becomes implicitly established as part of your repertoire, even though you haven't necessarily been instructed in anything explicit.
[00:11:50] So those are the four major categories.
[00:11:53] Christine Ko: I think you explain this in your book as well, the implicit category, that fourth category you said, where you're describing a rower being in a boat, you just learn kind of the culture. To me, that relates to the hidden curriculum of medicine, the culture of medicine, which tends to hide mistakes, tends to actually believe, as maybe the world does, that doctors are rational and that, as you mentioned before, somehow we're not prone to these JDM biases that everyone is.
The hidden curriculum suggests that I already know this stuff, but I think the hidden curriculum has actually taught me that I really don't know this stuff.
[00:12:33] Pat Croskerry: I think you're absolutely right. Diagnostic failure was a victim of the hidden curriculum. When I came into medicine, people never particularly discussed their failings. Diagnostic failure in particular was not discussed. I think we were simply unwilling to admit that we occasionally get it wrong. Not so much in dermatology, probably down around 2 or 3 percent, but in emergency medicine and internal medicine it's probably around 15 percent.
[00:13:02] Christine Ko: If I'm reading a hundred slides and I'm average, I would be making a mistake two to three percent of the time. That's really an uncomfortable thought. I circle back to metacognition and these JDM biases and learning about them, because I find comfort in the idea that I can improve my process of thinking and diagnosis, since diagnosis is thinking. Do you have any final thoughts?
[00:13:30] Pat Croskerry: Just recently, a doctor here made a mistake on a child's diagnosis. Now it could happen to anybody. If you know cognitive science and you know how dual process theory works and you know something about the various traps we fall into, then you would say, well, you know, this is why it happened. But that physician tried to take his life a few days later because he was so distressed by what he'd done. The thing is he didn't really do anything other than end up on the wrong side of the decision. I think if we understood the process better, we wouldn't be so hard on ourselves. I think we should be hard on ourselves if we get it wrong, but I don't think we should be driven to those extremes.
[00:14:14] Christine Ko: Well, thank you very much for spending time with me today. I really appreciate it.
[00:14:18] Pat Croskerry: My pleasure. Thank you.
[00:14:19] Christine Ko: The complex interplay between intuition, or System 1 (Type 1) processing, and its potential pitfalls is precisely what underscores the notion of the myth of intuition. Initial intuitions can shape the direction of decision-making, but they must be confirmed and balanced with more deliberate analysis, or System 2 (Type 2) processing, especially in very challenging and critical scenarios, such as we might encounter in medical diagnosis or treatment.
[00:14:49] I spoke with Dr. Christopher Chabris, who was involved with one of the most famous psychology experiments, which you can find on YouTube if you just Google basketball, experiment, psychology. He also has a book where he talks about five main illusions or myths: the illusions of attention, memory, confidence, knowledge, and potential. What follows is my conversation with him.
[00:15:21] Dr. Christopher Chabris is a Professor and Co-director of the Behavior and Decision Sciences Program at Geisinger Health System. He was formerly an Associate Professor of Psychology at Union College in New York. He has an AB in computer science and a PhD in psychology, both from Harvard University, and he worked at Harvard subsequent to that for about five years. He's coauthor of a really great book called The Invisible Gorilla, which I can't recommend highly enough for anyone in a profession where attention, confidence, and knowledge are important. Amazingly, he is also a chess master on top of all that. For more information on him, you can visit his website, which is in the show notes.
[00:16:00] I'll jump right in and just ask you what everyday illusion do you think people are most susceptible to?
[00:16:06] Chris Chabris: So that's a good question, but I don't think there's really an answer to it because when we talk about illusions of attention, memory, confidence, knowledge, and potential, which are five of the main ones we talk about, it's hard to quantify, like, how much of one there is versus another.
[00:16:21] I think that the main point is that we are all susceptible to all of these mistaken beliefs about how well our own minds work. An everyday illusion is a mistaken belief about how your own mind works in all kinds of activities. We have examples from law and order, well, not the TV show, but you know, the world of the criminal justice system, sports, finance, healthcare, games, politics, pretty much anything there is, because these are sort of fundamental misunderstandings that we have about how our own minds work.
[00:16:49] So I wouldn't say there's one that people are most susceptible to, but I think one that people often don't realize maybe as much as others would be the illusion of confidence. That's in part because the idea that we pay attention to and believe people who are more confident is not necessarily irrational.
[00:17:08] Usually there is some relationship between how well you understand something, how much you know about it, how accurate your memory is, and so on, and how confident you are about those. It's not like there's zero relationship between them. So, for example, in eyewitness testimony, more confident eyewitnesses are, on average, more accurate about their identifications of suspects than less confident eyewitnesses.
[00:17:31] The problem is they're not perfectly accurate, right? So they're still overconfident. The correlation is not as good as we'd like it to be, and yet juries will convict defendants sometimes based on just one compelling eyewitness. There's a reasonable chance that eyewitness could be completely wrong for a whole variety of reasons that have to do with how visual perception works, and memory for faces, and stress, and trauma and all kinds of stuff like that.
[00:17:52] But they all combine to make our memories not as good as we think they were and to make our confidence too high, and that can play out in a lot of different ways. I think it's one of the illusions that has the biggest effect in a wide variety of our everyday experiences.
[00:18:05] Christine Ko: Yeah. I really like this quote from your book, which says we take the fluency with which we process information as a signal that we are processing a lot of information, that we are processing it deeply and that we are processing it with great accuracy and skill.
[00:18:19] But effortless processing is not necessarily illusion free. That really speaks to someone like me who has been working in the fields of dermatology and dermatopathology for a long time now, which takes a lot of visual attention, recognition, perception. Sometimes I feel something's easy, but that doesn't necessarily mean ease or fluency translates to being right.
[00:18:41] Chris Chabris: Yeah, we wrote that in the book in part because we were looking to figure out why people make these mistakes, right? What are they paying attention to in their own thought processes that could lead them to think that they're doing better than they actually are?
[00:18:56] And I think part of the reason is fluency as you mentioned. So fluency in the cognitive psychology world, it's an internal experience of how quickly and readily and plentifully your thoughts happen. It's analogous to like your fluency with a language, right?
Like when you're fluent in a language, you can speak it more quickly, you make fewer errors, you understand it better, and so on. It's an analogy to that, but with our internal thought processes. Some thought processes we have operate very quickly. Danny Kahneman, you know, famously in his book Thinking, Fast and Slow referred to the System 1 and System 2 processes, which is a way of summarizing the fact that some mental processes happen so quickly, so outside of our awareness, and so automatically that the only sense we get is that they are happening instantaneously.
[00:19:40] And that is sort of a signal that we often take to be indicative that they're operating well. You know, a lot of visual perception and memory retrieval has that quality, right? When we're looking out at the world, we don't experience any difficulty, really, in processing things. Every so often we do, like when we see one of those visual illusions, right?
[00:19:56] Where it sort of flips back and forth between two things. Visual illusions are nice because they do reveal some of the boundaries of our perceptual abilities. Even when we're remembering things, sometimes we have trouble, but often when we don't have trouble and we do retrieve a memory, it just feels automatic and effortless, and we mistakenly assume that it must be correct. Whereas, in the background, unbeknownst to us, our memories are constantly being distorted and updated and rewritten, especially, ironically, when we retrieve them.
[00:20:24] It's like every time we tell a story, we change the memory a little bit. It's almost like playing a game of telephone with ourselves. You know, when you pass a message along from one person to the next, it gets distorted; well, the more often you tell a story, it's like you're passing it back and forth in your own mind, and it gets distorted, right?
Like, whatever part of it you emphasize this time might become a more solid part of it for the next time you retrieve it, and parts you don't mention this time sort of drop off, and maybe something you accidentally add this time gets put back into the memory. So then the next time you tell the story, it's got something it didn't have before.
[00:20:54] But we don't really notice any of that stuff. That all happens like in the background automatically. And we have a fluent experience of memory retrieval or visual perception, and therefore we think we're noticing everything and remembering things accurately. I think that's not the only reason we have these problems, but it's an important one that our fluency doesn't always give us an honest signal.
[00:21:11] Christine Ko: I'm glad you mentioned Daniel Kahneman's book, Thinking, Fast and Slow, because that's another one of my favorites. It's actually the book that first got me thinking about metacognition. What I found interesting is that he points out that the expert thinking doctors are trying to achieve through training, through med school, residency, fellowships if they do one, and then practice, is that fluency. That expert thinking, or System 1 or Type 1 thinking, is hardly thinking: sometimes we just easily retrieve something, it just pops up, and we don't really have to think hard about it, versus System 2 thinking, which is more analytical and logical. Would you say that these everyday illusions are more part of System 1 or Type 1 thinking?
[00:21:54] Chris Chabris: Well, I think they definitely are related to it. Reflecting on your own thought processes and asking, what am I missing? How could I be going wrong? And so on. That does seem to be inherently sort of a System 2 process. So some of the many words that are used to describe System 2 thinking are reflective and analytical and sequential and slow. Thinking about your own thought processes and noticing when they might be making errors, those seem inherently sort of slow System 2 processes.
[00:22:20] And also System 2 processes are often hard to initiate. Often they seem to take some kind of special mental effort. They are effortful, they feel effortful, unlike the fluent System 1 processing. So I would say, certainly checking your own intuitions and instincts and so on is more of a System 2 kind of thing.
[00:22:36] The problem with everyday illusions is that the marvelous output we get from System 1, which is able to do amazing things like recognize the face of people we haven't seen in decades, remember things we haven't thought about in years, recognize thousands and thousands of different patterns on your slides or your x-rays or your chess boards or whatever, comes with no real-time insight into what's going on there. That is where we become susceptible to the illusion that whatever we've noticed is everything there is, whatever we've remembered is accurate, and whatever our first instinct was must be the correct answer.
[00:23:08] Christine Ko: Yeah, in your book, you also end with what you call the myth of intuition, which I think is what you just touched on: with experience, especially in chess or in healthcare, making a diagnosis, part of expert thinking is this intuitive sense of I've seen this before. But we may be wrong, inadvertently missing other things that would steer us differently.
[00:23:31] Chris Chabris: Yeah, the reason why we called it the myth of intuition was that it's definitely true that experts can recognize patterns and recognize them instantly that amateurs, novices, less experienced people can't. That's one of the main engines of expertise, of skill, that enables one person to be much better at something than someone else.
It's not natural talent; it's practice and training and the acquisition of expertise, and that's been documented in cognitive psychology in all kinds of fields, from medicine to chess to other games to sports to pretty much anything. And it's quite well understood how that happens: you gradually build up a vocabulary of different things. It might not be a vocabulary of words, right? But it's a vocabulary of visual patterns or associations between things and so on, and that's what enables a doctor to say that kid looks sick, even if they can't immediately tell what's wrong with them, or to feel like they have a sense, before they know the diagnosis, that they're going to figure it out.
[00:24:23] Of course, it's often good to confirm your intuition later on, right? Where the myth of intuition comes in is twofold, I think. First, the intuitions happen before the confirmations happen. So if you stop before the confirmation, you're left only with the intuition, but you might not realize that you never really confirmed it.
[00:24:39] And this has happened in many cases. Like the art world, where often you'll have these paintings that are said to be done by a famous artist, and the way they're authenticated is just by showing them to experts and the experts look at them and often the experts just instantaneously say, yes, that looks exactly like, you know, a Jackson Pollock, and maybe they'll look a little bit more, but there are other ways to confirm that intuition.
[00:25:00] But people often take the expert intuition as the final judgment. In medicine, people are trained to look at laboratory tests to confirm things, to get a radiologist's opinion, to send it to the pathologist, and so on, but still the initial intuition, I think, can have a big impact on the further direction. Second, even experts can be wrong in their initial intuition. Coming from the domain of chess, where I have a lot of experience, often you will see the best players in the world, literally top 10 players, say, oh, I didn't even think of that move.
[00:25:27] And on the chess board, there's nothing easier for chess players to think about than where the pieces can move, right? That's like the fundamental thing. There's like 30 possible moves. And yet sometimes you just don't even think of one, which turns out to have been the best one or the best one for your opponent or something like that.
So that initial intuition, that initial pattern recognition, is powerful, but it doesn't go all the way. And in fact, when it's wrong, we can pay the price. When expert chess masters are questioned afterward, they'll say they simply didn't think of the move.
[00:25:55] People play chess at all different speeds. So believe it or not, there's something called 15 second chess, where you have 15 seconds for the whole game. And then there's something called correspondence chess, where a game could take two or three years. And there's every possible speed in between, even at sort of elite professional tournament speeds, which is let's say five hours for a game.
[00:26:11] And people spend like 20 minutes thinking about a single move. They'll come back afterwards and say, I made a mistake here because I never even considered that option. And I think that happens in a lot of professional decision making. Chess just gives me a window into it, where I'm sure there are many cases where the ultimate diagnosis was something that the first person who saw the patient never thought about.
It's like the New York Times column on diagnosis, right? Often the solution is not anything that the first doctor, or the second doctor, or the third doctor even thought about. Medicine is hard. But if we pay too much attention to confidence, if we believe too much in intuition and so on, you might go with whatever the first person said.
And really, like 90 percent of the time, that's right. It's just the other 10 percent of the time, an important 10 percent, right? The hard cases are hard, you know, for a reason, but they're just as important to get right as the easy cases.
[00:26:53] Christine Ko: Yeah. Is there a way around it?
[00:26:56] Chris Chabris: Well, I wish that there were a way around it... we have to respect expertise, but at the same time, we have to recognize the limitations of expertise. Expertise is not just being able to instantly classify things correctly; it's also understanding what other kinds of information you need to come to a definite conclusion. There's a famous quote attributed to Keynes: when the facts change, I change my mind. What do you do, sir? I think experts should be expert in that, right? If the pathology report says something they didn't expect, they don't just throw it in the garbage and carry on, right? They integrate that into their understanding of the decisions they're trying to make.
[00:27:28] And that's a struggle because once we form a belief, it's hard to just let it go, right? So that's a struggle that requires training and reflection, right? System 2 reflection and analysis and so on. And I wish I had a magic formula for that. But, you know, the first step is knowing about it.
The second step is designing the systems around us to sort of force us to do that. And I don't think I have any magic bullets for health care, but often just prompting people to reflect on things and to consider alternatives can be helpful.
[00:27:57] Christine Ko: Right. I love that. I love how you summarize that with: respect an expert, but also recognize the limitations of expertise. That's great. Do you have any final thoughts?
[00:28:08] Chris Chabris: One final thought that I do have related to what you mentioned earlier. We at Geisinger have a behavioral insights team that Michelle Meyer and I started. We really enjoy collaborating with clinicians and other people inside healthcare because it's such a rich environment for trying to help people make better decisions. I think there's so many opportunities, not because everyone's making terrible decisions in healthcare, but just because the situation is so complex and there's so many behavioral components to it.
[00:28:33] And I think we often neglect the behavioral elements. We focus a lot on, does the drug work? Does it not work? How big an effect does it have? What combination of treatments is best? And so on. Even when you have the best and fastest developed life saving vaccine in human history, we found that there are huge behavioral gaps in getting the maximum healthcare value out of that.
[00:28:55] And that, I think, has shown us, in more extreme form, what is true in the rest of healthcare also. We know that there are all kinds of other things that are behavioral, that make a difference, and so we're really focused on trying to apply this kind of science and knowledge and ideas to helping improve outcomes for the system, and the patients, and the clinicians, and everyone involved. So I'm excited to be working in this field. I love it.
[00:29:17] Christine Ko: I'm such a fan of your work. And one of my passions is getting better health care for patients. So I think the work that you're doing now is just so valuable. Thank you again for spending your time with me, Chris. I really appreciate it.
[00:29:30] Chris Chabris: Oh, it's been a pleasure. Thank you.
[00:29:32] Christine Ko: The power of intuition, harnessed by experts through pattern recognition, is a fundamental driver of expertise in fields like medicine, chess, and art authentication. However, this initial intuition, or System 1 processing, can be fallible, as even the best professionals may overlook crucial factors or alternative options in their first assessments. This highlights the critical need to understand the role of cognitive biases, which are inherent in human decision-making, in order to discern between helpful biases that enhance our performance and more detrimental biases that need mitigation to prevent errors and misjudgments. What follows is my conversation with Dr. Itiel Dror, who has spent much time and effort researching cognitive biases in different fields.
[00:30:21] Dr. Itiel Dror is a Senior Cognitive Neuroscience Researcher at the University College London. He received his PhD in Psychology at Harvard University. He researches information processing involved in perception, judgment, and decision making. Dr. Dror has dozens of research publications, which have been cited over 10,000 times. His academic theoretical work is complemented with applied work in healthcare, policing, and aviation. All of his work is really about improving human training as well as decision making. This type of work in decision making is so relevant to everyone, but especially doctors. More information on his work as well as his publications are available in a link in the show notes. Also, there will be links to some of his papers as well, including one in Science and other articles that talk in more depth about sources of bias, solutions to bias, and a paper on forensic pathology.
Welcome, Itiel.
[00:31:22] Itiel Dror: Thank you.
[00:31:22] Christine Ko: Would you be willing to share a personal anecdote about yourself as the first thing to just help listeners get to know you a little bit better?
[00:31:31] Itiel Dror: When I do research and present my findings, especially to forensic examiners, but also other experts, medical, aviation, policing, things that seem very clear and obvious to me are either mind-opening or outrageous for the listeners. So they either open their eyes and say, Oh my God, this is amazing; or they say I'm an idiot and the research is wrong, they want the paper retracted, and they get very defensive. I'm just amazed that when I talk to scientists, to educated people, about scientific findings, they're not able to respond to the scientific findings.
They get very emotional and they don't see the research, which of course connects to the topic itself: how emotions and feelings and expectations and beliefs impact and bias us. So it's important to really take cognitive biases seriously and be willing to acknowledge your weaknesses and listen and debate and then make decisions based on science and not emotions.
[00:32:37] Christine Ko: You brought up emotions and how emotions really color the way people think. And there is a bias called the affective bias, which I think is talking about how emotions do really influence our thinking. Not being a cognitive scientist, I don't know that emotions are really separate from thinking. I know that traditionally they're separated, but... can you add a little bit?
[00:33:01] Itiel Dror: We're all familiar with the saying, love is blind.
[00:33:05] Christine Ko: Yeah.
[00:33:05] Itiel Dror: So, I got an email last night from a friend I haven't talked to in many years. We were very good friends. One day he said to me, I met the woman of my dreams. She's beautiful, she's intelligent, she's funny, she loves me to bits. I said, great, I want to meet this woman. So we went to dinner, her, him, and me. And then after dinner he said, what do you think? And as a friend, I had to be honest. So I said to him, I don't think she's very smart, or funny, or good looking, and I think she's a gold digger. I don't think she loves you so much. Did he come and say, thank you Itiel, I'm not going to marry? No! He was angry at me! Didn't invite me to the wedding. Now, ten years later, he sent me an email last night. He said, Itiel, I need to apologize to you. I'm getting a divorce. All that you said was true. And he said, it's not that she's changed. She was like that before we got married. I just didn't see it. This is exactly an illustration that love is blind. It distorts how you evaluate. Your emotions are part of the thinking process, and they're not separate.
Like bias, it's not that emotions are bad, right? It's a weakness not to have emotions to help guide you. Again, like bias, it's a complicated picture.
[00:34:18] Christine Ko: Can I ask you to define cognitive bias?
[00:34:21] Itiel Dror: In a nutshell, the human mind is not a camera. Because of the brain's architecture and limited computational resources, we are not passive in processing information. The brain pays attention to some information more than other information, so our decisions and our perception are not based only on the input, not based only on what the eyes see. The brain distorts it and manipulates it. We distort and change how we process the actual data and make decisions, so even what we perceive as the data is not based only on the data but on other factors that have nothing to do with the information itself. So in your domain, you're perceiving and interpreting a slide with melanoma based not only on the slide, but on a lot of factors. And it's not one, it's a whole range of forces that impact what we see and how we see it and how we interpret it. And these are the cognitive biases.
[00:35:19] Christine Ko: I'm not a cognitive scientist, for sure. And I realized that I'm biased to think that cognitive bias is bad. And then I'll somewhat quickly remember, Oh yeah, wait. Cognitive bias is not always bad. And actually, I think it's mostly useful and positive. I think it's hard. It's an availability bias, right?
Because if we're always thinking of biases in terms of racism, sexism, and so on, then when you hear "cognitive bias," you think, oh, it must be like racism and sexism, et cetera. And so you're already biased against thinking about cognitive bias, because if you consider yourself not racist, you think, oh, I'm not biased against race, so I'm not biased cognitively, because I'm someone who's not biased. So just to emphasize the context you've given: every thinking person has cognitive bias, because cognitive bias is part of thinking. It's the way we think.
[00:36:16] Itiel Dror: Absolutely. First of all, so as not to annoy some in the audience who are listening and saying that I have to give a definition: I'll give you the definition that we have in a paper from 2013, where we define cognitive biases as the class of effects through which an individual's preexisting beliefs, expectations, motives, and situational context influence the collection, perception, and interpretation of information. So that's the definition.
Before we talk about whether biases are good or bad: they exist and we can't avoid them. We need to ask which ones are really bad, which ones are good, which ones are a bit bad, which ones are a bit good, and then take actions to minimize and even try to eliminate some of the really bad biases, live with some that may be a bit bad, and harness the better biases.
[00:37:18] Biases develop for a good reason. If you don't have biases, you're going to be paralyzed. Biases have developed for a very good computational brain architecture reason. We cannot do without them, and we don't want to do without them, because they're selective and help us do what we want to do and achieve and be effective and intelligent and experts in our domain.
[00:37:45] We need to understand which biases are helpful, which biases are not helpful, and which biases are bad. Of course it's a bit more complicated because some biases are very good in certain situations, but sometimes they are bad. So it's like driving on the road. Maybe I shouldn't say that I speed when I drive, but when the road is wet, when there's ice on the road, when I'm tired, I slow down.
You need to know when you're on a slippery road, when there is fog or ice, and you need to slow down and take measures not to have an accident, to take measures so that the biases will not lead you to make erroneous decisions. Awareness and willpower do not change the biases; they're unaffected by that. So we need to take this issue of biases and discuss it in medical school and among ourselves, and do research and seek and contemplate. Then we're moving forward.
[00:38:44] Christine Ko: Yes, I think I would have very much benefited from having some exposure to metacognition and cognitive bias and what that can mean for the practice of medicine.
[00:38:58] Itiel Dror: Medical decision making, diagnosis, is what you do day in and day out; or in surgery, where you make decisions and time pressure affects decision making. I'm expanding now from bias to something much bigger: cognitive processes, perception, judgment, and decision making should be a whole course in any medical school. We need to make it mandatory. It's part of patient safety, which we do care about.
The whole healthcare system, at least in the U.S., is becoming adversarial because doctors are more and more afraid of being sued for malpractice. I think they are afraid of admitting to error.
[00:39:36] Christine Ko: I agree.
[00:39:37] Itiel Dror: It's not only what you say. I think it's much, much worse. For example, it's not only that medical doctors will not acknowledge their mistakes because of lawsuits. I would say, and this is going to be very provocative, that the decision of how to treat the patient takes into account, maybe without their awareness, the implications later for a lawsuit. The fear of a lawsuit contaminates the decision. It impacts the actual treatment they give patients, because in the background, they say, if I do this and it goes wrong, I'm well protected from a lawsuit. But if I do this other thing, which is a better medical treatment, and something goes wrong, and it goes to a lawsuit, then I have a problem. The more I understand human decision making, the more I see that people make decisions based on ideology, motivation, personality, and their experiences, and the data is less and less important.
[00:40:35] Christine Ko: To move forward, to improve, what's really necessary for human beings, but also doctors, is feedback. I feel, for me in particular as a dermatologist and dermatopathologist, that I don't get enough feedback. There are a certain number of my own cases that are sent out for a second opinion. Another way that we get feedback at my institution is grand rounds with patients. Oftentimes, relatively soon after we give a diagnosis for a given patient slide, that patient might be brought to the grand rounds. And so then we do get feedback from that, from seeing the whole context, the photographs, the medical history, et cetera, if we haven't gotten it already. But I think the vast majority of my cases just go out into the world. And I assume I'm right. I hope I'm right. But I would say that I don't really get feedback on the majority.
[00:41:33] Itiel Dror: Let's tease it apart. You said you want feedback. Most people, when they tell me they want feedback, want to hear how great they are. And if they don't get positive feedback, they get defensive. The problem is that even though it's great to hear good feedback, we learn from the negative feedback. If you're not defensive, that is where we actually learn the most. We need that to improve.
[00:41:56] Christine Ko: I agree, most people want positive feedback, but I think you're absolutely right. I learn the most, I realize, from my mistakes. I definitely do not love it when I make a mistake. I realize I feel a fair amount of shame. And it's very uncomfortable. I want to turn away from shame. Most people want to turn away from shame. When I feel shame, people want to turn away from me and they often do turn away from me. I actually am learning to embrace the errors. Not that I want to make more errors, but I'm learning to embrace them because that does really make me better.
[00:42:27] Itiel Dror: If you embrace them, you will make fewer errors in the future. I've suggested using linear sequential unmasking. Linear sequential unmasking says very clearly that, given that the order of information is important, the first piece of information causes you to generate ideas and hypotheses and impacts how the brain perceives and interprets subsequent information. And even though we try to reserve judgment until we're exposed to everything, we can't.
We see something, and the brain starts its activity: neurons firing, hypotheses, and so on. So let's think: what is the order of seeing the information, what is the right sequence? Linear sequential unmasking suggests how to optimize the order. For example, let me ask you: when you're looking at the slide, do you look at the slide first, or before you look at the slide, do you read the context, the family history, and everything? Linear sequential unmasking just asks the experts to consider what order to look at the different pieces of information in.
[00:43:29] Christine Ko: I'm glad you brought up your concept of linear sequential unmasking because I was not familiar with that term before reading some of your work. For dermatopathology as well as dermatology, I think that we do use linear sequential unmasking. Oftentimes people will say, look at the patient without getting the history, so you're not biased; or, look at the slide before you read what's on the biopsy requisition form, so you're not biased. And then look again and revise what you're thinking based on what the patient says or laboratory tests or other data.
[00:44:05] Itiel Dror: The only thing I would say, it's not that you're not biased. You're always biased. The question is to minimize and decide which biases you want to take first versus later, which biases are more negative and which ones are not negative and may even help you. You're always going to be biased.
The question is which biases, in what order, and what we can do about it. I also do work in marketing, branding, and advertising. That work is about increasing biases. I want people to stand in line and pay a lot of money for a product, not because it's good, not because it's cheaper. Telling people how good the product is will not make them buy it. People don't make the decision based on how good the product is. It depresses me. People are so ideological and emotional. It's a big challenge, but we have to try to do something about it. In the medical domain, how do we incorporate emotions and bias in medical training and medical decision making?
Not by telling the doctor, don't have emotion, try to fight your bias, but by acknowledging the role of biases and emotion in decision making, harnessing the positive elements and minimizing the negative ones. So I'm trying to be positive towards the end. It's very hard. I'm making an effort to end on a positive note.
[00:45:24] Christine Ko: That's perfect. Thank you so much for spending the time with me.
[00:45:27] Itiel Dror: Thank you.
[00:45:28] Christine Ko: I hope these conversations have increased awareness of dual process theory, which is divided into System 1, or Type 1, and System 2, or Type 2, processing, as well as of the significance of our emotions and the affect heuristic in decision-making. Fluency can generate a false sense of competence in decision-making, as well as in memory and perception, underscoring the importance of recognizing and managing cognitive biases. Thank you for tuning in. I hope that you will follow and share this episode and this podcast.