Girl Doc Survival Guide
Young doctors are increasingly in ‘survival’ mode.
Far from flourishing, the relentless pressure of working in medicine means that ‘balance’ is harder than ever to achieve.
On the Girl Doc Survival Guide, Yale professor and dermatologist Dr Christine J Ko sits down with doctors, psychologists and mental health experts to dig into the real challenges and rewards of life in medicine.
From dealing with daily stressors and burnout to designing a career that doesn’t sacrifice your personal life, this podcast is all about giving you the tools to not just survive...
But to be present in the journey.
EP71: Dr. Gerd Gigerenzer on Heuristics
I was confused about "heuristics"! It is always so exciting when I learn something new that makes things clearer to me. According to Dr. Gigerenzer, heuristics are NOT the same as cognitive bias. Cognitive bias describes PAST behavior, often in situations of so-called "risk" (the economic term, meaning that all variables are known). Heuristics guide what can be DONE in the FUTURE and are helpful in situations of uncertainty. Dr. Gigerenzer is an international expert on judgments under uncertainty, and this is part 1 of my conversation with him on heuristics. Dr. Gerd Gigerenzer, PhD is Director of the Harding Center for Risk Literacy at the University of Potsdam, Faculty of Health Sciences Brandenburg and partner of Simply Rational - The Institute for Decisions. He is former Director of the Center for Adaptive Behavior and Cognition (ABC) at the Max Planck Institute for Human Development and at the Max Planck Institute for Psychological Research in Munich, Professor of Psychology at the University of Chicago and John M. Olin Distinguished Visiting Professor, School of Law at the University of Virginia. Awards for his work include the AAAS Prize for the best article in the behavioral sciences, the Association of American Publishers Prize for the best book in the social and behavioral sciences, the German Psychology Award, and the Communicator Award of the German Research Foundation. His award-winning popular books Calculated Risks, Gut Feelings: The Intelligence of the Unconscious, and Risk Savvy: How to Make Good Decisions have been translated into 21 languages. His academic books include Simple Heuristics That Make Us Smart, Rationality for Mortals, Simply Rational, and Bounded Rationality (with Reinhard Selten, a Nobel Laureate in economics). In Better Doctors, Better Patients, Better Decisions (with Sir Muir Gray) he shows how better informed doctors and patients can improve healthcare while reducing costs.
[00:00:00] Christine Ko: Welcome back to SEE HEAR FEEL. Today, I'm honored to be with Dr. Gerd Gigerenzer. Dr. Gerd Gigerenzer, PhD, is Director of the Harding Center for Risk Literacy at the University of Potsdam, Faculty of Health Sciences, Brandenburg, and partner of Simply Rational, The Institute for Decisions. He is former director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development and at the Max Planck Institute for Psychological Research in Munich, Professor of Psychology at the University of Chicago, and John M Olin Distinguished Visiting Professor, School of Law, at the University of Virginia. Awards for his work include the AAAS Prize for the best article in the Behavioral Sciences, the Association of American Publishers Prize for the best book in the Social and Behavioral Sciences, the German Psychology Award, and the Communicator Award of the German Research Foundation. He has many popular books as well as academic books, and his award-winning popular books include Calculated Risks, Gut Feelings, and Risk Savvy. His books have been translated into 21 languages. His academic books include Simple Heuristics that Make Us Smart, Rationality for Mortals, Simply Rational, and Bounded Rationality. He also has a book called Better Doctors, Better Patients, Better Decisions, and he speaks in that book about how better informed doctors and patients can improve healthcare while reducing costs.
[00:01:29] Welcome, Gerd.
[00:01:31] Gerd Gigerenzer: Yeah, I'm glad to be with you.
[00:01:33] Christine Ko: First, would you be willing to share a personal anecdote?
[00:01:37] Gerd Gigerenzer: Okay, so how did I come to academia? When I was studying, I financed my time at the university by playing music. I was a musician. So when I became a graduate student and was nearing the final year, the moment when you defend your dissertation, I had to make a decision. Should I stay on the stage playing music and earning much more money than as an academic, or should I risk an academic career? And I couldn't know, will I ever make it to become a professor? I didn't think very long. I took the risky option, and for me, the risky option was academia, not music, and it worked out.
[00:02:22] Christine Ko: That's cool. I love that story because it's exactly all about your work. Your work centers on risk versus uncertainty. Can you do that briefly now, define risk versus uncertainty?
[00:02:37] Gerd Gigerenzer: That's an important distinction, the difference between risk and uncertainty. The technical term "risk" means a world that is stable, where you know everything that can happen in the future, all the consequences and their probabilities. If you have a well-defined situation, that's one of risk, and you can try to model it or do decision making; you can be as complex and fine-tuned as possible. These worlds exist. For instance, if you go to the casino this evening and play roulette, then you are in a world of risk. Everything is certain. All that can happen are numbers between zero and 36. And you know the consequences and the probabilities. And that's why, for me, roulette is boring. Most economic theories assume a world of risk. Many cognitive science and neuroscience experiments are set up so that at least the researcher knows everything.
[00:03:40] I'm interested in a different situation. Namely, how do we make decisions under uncertainty, where surprises can happen, where totally unforeseen things may hit you? Under uncertainty, it pays not to fine-tune too much on the past, if the future isn't like the past. Technically, you overfit the past. So find a few variables that really are important, and that's basically the idea of heuristics. Focus on the one, two, or three important things and ignore the rest. So my decision, whether to stay on the stage or go into academia, was a decision under uncertainty. I could not even have an idea of what would happen and what the consequences would be, nor could I attach any probabilities to them.
[00:04:37] Christine Ko: This is where the words that we use colloquially can be hard because when you told your story, you said you chose the riskier option to go into academia, even though that was actually the more uncertain option.
[00:04:50] Gerd Gigerenzer: Yeah. That's the everyday term.
[00:04:53] Christine Ko: A lot of the decisions that I make as a doctor are, I think, judgments under uncertainty because I don't have all the data. I can never probably have all the data. I can't see everything. I can't know everything. Can you talk a little bit about judgments under uncertainty?
[00:05:11] Gerd Gigerenzer: First, you're totally right. Most things that doctors do have an element of uncertainty. And most of our decisions are decisions under uncertainty, not under risk. We do have data, but not enough. And so the big difference is that in the situation of risk, all you need is to calculate. You calculate the probabilities, like on the roulette table, you can calculate how much you will lose in the long run.
[00:05:38] Under uncertainty, you can calculate to some point, but it's not enough. You need other things. And these things are what I call heuristics [a heuristic is a simple rule that works under uncertainty], and also intuition, and a third element is stories, cases, which you compare. And these are the key elements to deal with uncertainty.
[00:06:06] Christine Ko: You mainly use the word " heuristic".
[00:06:09] Gerd Gigerenzer: Yeah.
[00:06:10] Christine Ko: Do you think cognitive bias is the same as heuristics? People use them interchangeably?
[00:06:17] Gerd Gigerenzer: No. No. [Okay.] Mainstream behavioral economics talks about heuristics and biases. This is not my view. Most of mainstream economics studies simple problems where everything is known. It's like roulette. That's not uncertainty, and you don't need heuristics. The confusion comes in because these colleagues tend to generalize into the real world and not make the distinction between risk and uncertainty.
[00:06:46] Christine Ko: I think that it becomes a semantic issue, but it causes confusion because, for example, even availability bias, I also see it written as availability heuristic.
[00:06:56] Gerd Gigerenzer: Yeah. This is the Kahneman and Tversky tradition. That's not my research. The economic view is that there is a Homo economicus, which is based on abstract, probabilistic reasoning, and there is always an optimal answer. That doesn't help you very much as a doctor. Because if I tell you, just find the optimal solution. Huh? What do you do then? The Kahneman-Tversky school of behavioral economics, and I emphasize that because there are other behavioral economists, like myself; they take the norm of Homo economicus even more seriously than economists do and try to show that human judgment deviates. And that's then the cognitive biases story. But they need to assume that there's an optimal answer that Homo economicus would find, and that we are just a little bit too stupid to find it. That's not my view. Under uncertainty, Homo economicus is mute, as there is no way to optimize. The best thing you can do is to use heuristics, or intuition, or narratives. And then the only question is, is that the right heuristic? Is that the right narrative? That's the real question. And the entire story of cognitive biases requires a Homo economicus, an omniscient god. And a last remark on the negative connotation of heuristics, which I do not share. It is new. It started in the 1970s. But look, Einstein put the term heuristic in the title of his 1905 Nobel Prize-winning paper, and that wasn't about biases. He thought it's a method to move on. Einstein once said, the intuitive spirit is a gift, and the analytic spirit is its servant. And we have created a society that honors the servant and has forgotten about the gift. That's Einstein. He attributed much of his incredible innovation to his intuition.
[00:09:07] Christine Ko: So to you, cognitive bias is different than heuristic.
[00:09:11] Gerd Gigerenzer: Yes, of course. Heuristic is something that tells you what to do.
[00:09:16] Christine Ko: Yeah. I had read or heard you say that tools to address uncertainty include heuristics, intuition, adopting narratives, but you also included finding people to trust.
[00:09:28] Gerd Gigerenzer: Yeah. That's a kind of heuristic. If you have no idea what to do, you need to exploit the fact that we are social beings and find out whom to trust. And then you may imitate. These are social heuristics. It's very common that you're not a specialist in a certain topic, so you ask someone whom you can trust. And imitation is the basis of our culture. If we didn't imitate, we would have to learn everything anew. The studies that compare, say, chimpanzees with human toddlers show that children imitate much more precisely and much more generally than chimpanzees. That's the way to learn culture and also much of training. So if you're in medical school just sitting in the lecture hall, you're not imitating. But if you watch a surgeon to learn, for instance, you begin to imitate. And that's an important skill, right? So that's a social heuristic.
[00:10:39] With every heuristic, you need to make a judgment whether it's the right heuristic for this problem or even whom to imitate. Here, trust comes in and experience.
[00:10:52] Christine Ko: Yeah. And making mistakes.
[00:10:54] Gerd Gigerenzer: Yeah, learning from mistakes, but the smarter way is learning from the mistakes that others make.
[00:11:01] Christine Ko: Yes, if possible. So do you distinguish social heuristic from other heuristics?
[00:11:08] Gerd Gigerenzer: Oh, yes. [You do.] Most heuristics can be used for problems that we wouldn't call social, like investing your money, and also for problems that we would call social. For instance, a heuristic called invest equally is a very successful heuristic for investing in stocks. You have a number of n stocks, and then you could either do a complicated financial weighting of them, or you could just do what's called one over n. That means invest equally. If n is 2, it's 50/50. The same heuristic can be used by parents who have two or more children and ask, how should I distribute my time among the children? So every heuristic I know can be used for non-social and social problems, but there is a special group of heuristics that are social, like imitation, which only works with other beings, or advice taking. And they're very important.
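[Editor's note: the 1/N rule described above is simple enough to sketch in a few lines of Python. This is an illustration added for this write-up, not something from the conversation, and the option names are hypothetical.]

```python
def one_over_n(options):
    """Allocate a total of 1.0 equally across n options (the 1/N heuristic).

    The same rule applies whether the options are stocks or children's
    time slots: no historical data or parameter estimation is needed.
    """
    n = len(options)
    if n == 0:
        raise ValueError("need at least one option")
    return {option: 1.0 / n for option in options}

# Two assets: 50/50, as in the example above.
print(one_over_n(["stock_a", "stock_b"]))  # {'stock_a': 0.5, 'stock_b': 0.5}
```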
[00:12:13] Christine Ko: That kind of social and thinking about other people makes me think about emotions and emotional intelligence. What's your take on emotional intelligence and is that related to social heuristics in your mind?
[00:12:28] Gerd Gigerenzer: Yes. I prefer the term social heuristics because they tell you what to do. Social intelligence or emotional IQ is often linked to a certain program, which runs like IQ tests. You get a test and then you calculate your social or emotional IQ. But what do you do then? Heuristics are much more useful.
[00:12:50] Christine Ko: I have been just really fascinated with reading your work, trying to understand more about cognitive bias and heuristics because of the uncertainty that I feel in diagnosis and being a doctor, but also being a parent, being a person. A lot of what your work says applies to the diagnostic process and to what happens in a clinic encounter... I'm thinking yes, it must and does apply to life as well.
[00:13:26] Gerd Gigerenzer: Most of real life is uncertain. And even those Nobel Prize-winning economists who make complicated optimization models do not necessarily follow these models in their own lives. A good example is Harry Markowitz, who got his Nobel Memorial Prize for the question of how to invest in n assets. He had a model, the mean-variance portfolio, that won him the Nobel Prize. But when Harry Markowitz made his own investments for his retirement, did he use his Nobel Prize-winning optimization method? We might think he did. No, he didn't. He used a simple heuristic, the one I explained before, 1 over n. So you have n assets, say two, and you do 50/50. That's it. You don't need any data. Whereas with the typical finance models, and much of finance is built on Markowitz-type models, you need years of data to estimate the parameters of your models. And the world of finance is highly uncertain. We have seen that in the big crisis, when, just a few years before, eminent economists in finance had assured us that everything was fine, that the models had it under control.
[00:14:54] Heuristics can be used consciously, like in investment, but also unconsciously. And this unconscious use of a heuristic is called intuition. So you feel that you shouldn't do that, or that you should do this, but you can't explain it. The reasoning is unconscious. There is a large literature in behavioral economics that mostly tries to speak badly of intuition and to run experiments showing that people make all kinds of cognitive errors. This is highly misleading; most of these experiments are calculation experiments, meaning situations of risk. In the real world it's different. For instance, an experienced physician can sense in the blink of an eye when something is wrong with a patient, without being able to articulate why. Or chess masters report that their intuitive play is the secret to their success. Intuition requires years of experience. Only then do you have good intuition about a certain topic. It's a form of unconscious intelligence.
[00:16:01] I so enjoyed speaking with Dr. Gigerenzer. I am going to stop here and continue the conversation in part two, where we keep exploring how to make good judgments under uncertainty.