If the Rationalist movement can be said to have a leader, I would argue that it is Julia Galef. She hosts the podcast Rationally Speaking, and is the founder of the Center for Applied Rationality, which tries to train people to eliminate cognitive bias. And now she has written a book! It’s called The Scout Mindset: Why Some People See Things Clearly and Others Don't, and you can buy it on Amazon here.
In the interview that follows, I talk to Julia about different concepts of rationality, about the purpose of the “scout mindset”, about whether rationality will win in the marketplace of ideas, and more!
N.S.: So I hear you have a new book! It's called "The Scout Mindset: Why Some People See Things Clearly and Others Don't". I'm going to read it, but why don't you give me a preview of what it's about!
J.G.: I do! It's about, unsurprisingly, the scout mindset -- which is my term for the motivation to see things as they are, not as you wish they were. In other words, trying to be intellectually honest, objective, and curious about what's actually true.
The central metaphor in the book is that we are often in soldier mindset, my term for the motivation to defend your own beliefs against arguments and evidence that might threaten them. Scout mindset is an alternative way of thinking. A scout's goal is not to attack or defend, but to go out and form an accurate map of what's really there.
So in the book, I discuss why soldier mindset is so often our default and make the case for why we'd be better off shifting towards the scout instead. And I share some tips for how to do that, which I illustrate with lots of real examples of people demonstrating scout mindset, in science, politics, sports, entrepreneurship, activism, and lots of everyday contexts as well.
N.S.: So are we always better off being scouts instead of soldiers? Just to indulge in a bit of devil's advocacy, don't many entrepreneurs succeed by being overconfident about their idea's chance of success? And doesn't irrational optimism often sustain us through trying times? Isn't excessive realism considered a hallmark of clinical depression?
I guess this is a specific way of asking about the more general question of analytic rationality versus instrumental rationality. Are there times when, if we were a planner trying to maximize our own utility, we would choose to endow ourselves with logical fallacies and incorrect beliefs?
J.G.: Yeah, my claim isn’t that soldier mindset has no benefits. My claim is that:
1. We overestimate those benefits, and
2. There are usually ways to get those benefits without resorting to soldier mindset
I’ll briefly sum up my case for those claims. To the first point, one reason we overestimate soldier mindset’s benefits is that they’re so immediate. When you convince yourself “I didn’t screw up” or “My company is definitely going to succeed,” you feel good right away. The harms don’t come until later, in the form of making you less likely to notice yourself making a similar mistake in the future, or a flaw in your business plan. And just in general, humans tend to over-weight immediate consequences and under-weight delayed consequences.
(As an aside, it’s worth noting that the research claiming things like “People who self-deceive are happier” is really not very good. I’m willing to believe that self-deception can make you happy, at least temporarily, but I wouldn’t believe it as a result of the academic research.)
Then to the second point… even though people often claim that you “need” soldier mindset to be happy, or confident, or motivated, there are lots of counterexamples disproving that.
For example, you brought up the claim that entrepreneurs need to be overconfident in their odds of success, in order to motivate themselves. That is a common claim, but in fact, many successful entrepreneurs originally gave themselves rather low odds of success. Jeff Bezos figured he had a 30% shot at success with Amazon, and Elon Musk gave his companies (Tesla and SpaceX) each a 10% chance of success.
Yet obviously both Bezos and Musk are highly motivated despite recognizing the tough odds facing them. That’s because they were motivated not by the promise of a guaranteed win, but by the high expected value of the risk they were taking: The upside of success was huge, and the downside of failure was tolerable. (“If something is important enough, you should try,” Musk has said. “Even if the probable outcome is failure.”)
That kind of thinking about risk is a better source of motivation, I would argue -- because it doesn’t require you to believe false things.
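(To make that expected-value logic concrete, here's a back-of-the-envelope sketch in Python. The probabilities and payoffs are made-up numbers for illustration, not figures from Julia or the book.)

```python
# Toy expected-value comparison: a long-shot venture vs. a safe path.
# All numbers here are illustrative assumptions, not real data.

p_success = 0.10          # honest, low odds that the venture works out
payoff_success = 100.0    # huge upside if it works (arbitrary "value" units)
payoff_failure = -5.0     # tolerable downside if it fails
safe_alternative = 3.0    # value of the safe, predictable path

ev_venture = p_success * payoff_success + (1 - p_success) * payoff_failure
print(f"Expected value of the long shot: {ev_venture:.1f}")       # 5.5
print(f"Expected value of the safe path: {safe_alternative:.1f}")  # 3.0
# The gamble can be worth taking at honest 10% odds -- no overconfidence required.
```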
N.S.: Got it! The idea of scout mindset reminds me of my favorite Richard Feynman term: "a satisfactory philosophy of ignorance". Was Feynman's thinking influential to you at all?
Anyway, I have another question, about the relationship between the soldiers and the scouts. In real armies, scouts and soldiers are on the same side, doing different jobs but fighting a common enemy. What is the common enemy in the case of people who take the two mindsets? Or if not an enemy, what is the common purpose that unites them, or ought to unite them?
J.G.: You might be pushing the limits of my metaphor here. It’s true that in real life, scouts and soldiers share a larger goal, but in my metaphor they don’t (I was focused more on the roles of scout and soldier, not their shared goal).
Indeed, there are multiple different goals for which scout mindset might be put to use, such as “Helping you make good decisions” or just “Satisfying your curiosity for its own sake.” And likewise there are multiple different goals for which soldier mindset might be put to use, such as “Protecting your ego” or “Staving off negative emotions.” (Although as I mentioned, I think there are usually ways to achieve those goals without using soldier mindset!)
And that’s a great piece by Feynman. You’re correct that he’s been an inspiration to me. I love his intellectual curiosity, and I often borrow his quote, “The first thing is that you must not fool yourself – and you are the easiest person to fool.”
That quote is from a speech Feynman gave called “Cargo Cult Science,” in which he describes how the essential ingredient of doing science is being willing to “lean over backwards” to avoid fooling yourself. For example, pre-committing to publish your results no matter which way they turn out. Or recording a prediction ahead of time, so that you can’t later convince yourself “I always knew this would happen.”
Or another example, which I describe in the book, is what’s sometimes called “blind data analysis.” You automatically alter your data by some amount, and then carry out your analysis on that altered data, without knowing what the real data is. Only once you’re finished with the analysis do you get to “un-blind” your data and see what results your methods yield on the real stuff. So you can’t, consciously or unconsciously, put a finger on the scale by choosing methods that just-so-happen to yield the results you want.
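(For the curious, here's a minimal sketch of what blind data analysis can look like in code. The offset scheme and numbers are my own illustrative assumptions, not a specific procedure from the book.)

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Pretend these are the real measurements we want to analyze.
real_data = rng.normal(loc=5.0, scale=1.0, size=100)

# Blinding step: a hidden random offset is added before anyone looks at the data.
hidden_offset = rng.uniform(-10, 10)
blinded_data = real_data + hidden_offset

# All analysis choices (cuts, outlier rules, models) are made on the blinded data,
# so they can't be tuned, consciously or not, toward a desired answer.
blinded_estimate = blinded_data.mean()

# Only after the analysis is frozen do we "un-blind" and report the real result.
unblinded_estimate = blinded_estimate - hidden_offset
print(f"Estimate after un-blinding: {unblinded_estimate:.2f}")
```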
N.S.: Yeah. It strikes me that what you're describing here is really a scientific method. It's not quite the same as the classic version we learn in grade school, which is really suited to the natural sciences and to lab experiments. Instead, what you're describing is the new scientific method that we're working out for dealing with empirical sciences -- things like economics and sociology where you can't necessarily put things in a lab and discover universal laws of nature. As empirical research has come to dominate many disciplines, we're discovering all new ways to fool ourselves -- p-hacking, specification search, overfitting, and so on. The analytical tools you're describing are some of the methods we're developing to guard against that.
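(A toy illustration of that kind of self-deception: run enough specifications on pure noise and something will come out “significant.” The setup below is purely illustrative, not from the interview.)

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)

# 20 "specifications" run on pure noise: there is no real effect anywhere.
significant = []
for spec in range(20):
    group_a = rng.normal(0, 1, size=50)
    group_b = rng.normal(0, 1, size=50)   # same distribution -- no true difference
    _, p_value = stats.ttest_ind(group_a, group_b)
    if p_value < 0.05:
        significant.append((spec, round(p_value, 3)))

# With 20 tries at the 5% threshold, roughly one "finding" is expected by chance.
print(f"'Significant' specifications found in pure noise: {significant}")
```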
But what's even more interesting to me is that you're proposing we take these same methods and use them in our own lives, outside of science. Which is not to say that we should treat our lives like research, but rather that many of the same analytical approaches that help us avoid research mistakes can also help us avoid mistakes in our lives. Would you say that's an accurate characterization?
And if that's accurate, could you give my readers a little preview? What are one or two life situations where the mental tools of the Scout Mindset lead to better decision-making?
J.G.: Yeah, that’s an apt comparison. The reckoning that’s been happening in the empirical sciences in the last 10-15 years is really about how to make the process of science more scout-like, even if individual scientists -- who are, after all, human beings -- will always be part soldier.
The one thing I’d change in your otherwise-excellent description is that “analytical approaches” aren’t that central to what I’m proposing in the book. In our everyday lives, I see more promise in solutions that focus on increasing self-awareness, or changing how you feel about being wrong. Or changing the community you’re embedded in, so that you’re surrounded by people who value scout mindset more than soldier mindset.
I’ll give a couple examples. In my section on self-awareness, I describe some thought experiments that can help you look at an issue with clearer eyes. One is the conformity test: When you find yourself agreeing with someone else’s opinion (or with a group consensus), imagine that they told you, “Actually, I don’t believe that after all. I was just playing devil’s advocate.” Would your own opinion shift too? Or would you still hold the opinion independently of them?
I also talk a lot about ways to make yourself more open to considering inconvenient truths. For example, when I’m in an argument and I start to suspect I might be wrong, it’s very tempting to dismiss that thought and instead reach for ways to defend myself. I find it easier to resist that urge when I remind myself of some “silver lining” of being wrong. For example, sometimes I remind myself, “If I concede this point, that gives me more credibility in general, because I’ve shown that I’m willing to concede points. So it’s like I’m investing in my future ability to be convincing.” That doesn't make the prospect of being wrong *appealing*, per se, but it does make it tolerable enough that I'm willing to face it.
N.S.: Got it. So here's a question. In the ecosystem of ideas, will the Soldiers tend to win out over the Scouts? William Butler Yeats, who was my avatar picture on Twitter for years, wrote the famous line: "The best lack all conviction, while the worst are full of passionate intensity." Is there any reason to think this won't be the way it regularly plays out in our society with Soldiers and Scouts? Will we always end up having our noosphere ruled by the people who unthinkingly adopt motivated reasoning, while the honest and thoughtful people sit around debating things against themselves in their minds?
J.G.: It depends on the audience. A lot of audiences aren’t that concerned with what’s true, and are mostly just looking to hear claims that are entertaining or validate their own preconceptions. For those audiences, yes, I think you have an advantage if you’re a soldier, because that allows you to optimize for what your audience wants without being constrained by truth.
However, if the audience you’re trying to reach is more discerning, or if they’re a priori somewhat skeptical of your position, then I think scout mindset helps a lot, because it allows you to recognize the flaws and weak points in your own thesis. And that helps you persuade in two ways: First, because it allows you to revise your thesis so that it’s stronger, and therefore more compelling on its own merits. And second, because it allows you to be honest about the weak points – which, for an audience that’s predisposed to skepticism, actually builds trust and makes them more open to your argument.
I’ll give you an example. When Charles Darwin published On the Origin of Species in 1859, making the case for evolution by natural selection, he was met with a ton of skepticism and pushback from the scientific community. Nevertheless, within 10 years, a majority of scientists had been won over to Darwin’s view of evolution.
Why? Because Darwin had spent years obsessing over the weak points in his thesis, the facts that didn’t quite fit (like the existence of peacock feathers, or sterile ants, both of which seemed to contradict the theory of natural selection). As a result, by the time he published, he had already revised his thesis to account for those facts. Or, failing that, he would acknowledge the weak points openly in his book, and say something along the lines of “Here’s my guess about an explanation for this, although I can’t prove it…” which his readers found disarming.
So essentially, any objection people could come up with, he had already recognized and taken into account by stress-testing his own theory within an inch of its life.
N.S.: So here's a question. The other day you talked to Matt Yglesias about getting the Iraq War wrong. I remember that's something you and I had talked about as well. Are there any other situations where some Scout Mindset would have helped you make a better decision or support a better public policy?
J.G.: Well, it's rare for me to form a confident opinion about policy, just because it takes me so much time to investigate a topic to a point where I can have justified confidence in a position. But something close to what you asked might be the way I was thinking about Covid risk in February 2020.
At the time, it seemed so unlikely to me that Covid would become a huge, world-disrupting pandemic, even though a bunch of smart people I'm friends with were concerned about that outcome. I didn't have any great reason for my skepticism. In retrospect, I think it was just the result of a knee-jerk normalcy bias -- the tendency to assume that things will keep going in the same way they always have in the past, even in the face of compelling evidence to the contrary. If I had bothered to press myself on that intuition, I think there's a good chance I would have realized I couldn't defend it.
My failure at predicting the pandemic fortunately didn't have any real-world repercussions. But I still like to scrutinize things I got wrong, and ask why I got them wrong, so that I have a better sense of the kinds of errors I'm prone to. I'll be a little more alert to normalcy bias in myself from now on.
N.S.: Ha! I made the same mistake. Against stupidity the gods themselves contend in vain, as Friedrich Schiller said.
OK, last question. Now that you've written this book, what's next on your plate? What are you up to these days?
J.G.: One thing I'm up to is traveling around the country with my fiance, spending one month at a time in different Airbnbs. And once Americans are allowed to travel to more countries we hope to continue our nomadic lifestyle farther afield.
But I have also been thinking about my next project. One thing I'm really interested in is getting to the bottom of intellectual disagreements. Before COVID hit, I was experimenting with different methods of having productive debates over artificial intelligence risk, philanthropy, housing reform, and other important issues that lots of people I know disagree about. I'd like to get back to that, and hopefully refine some methods for reaching truth as a group.
Great interview. I am looking forward to seeing whether a couple of points are addressed in the book:
1. The relative ease of soldier-mindset solutions vs "healthier" solutions to motivation problems. Like, it's great that Bezos and Musk could convince themselves that their long shots were worth taking, rather than deluding themselves that they had a sure thing. But I think most of us find motivating ourselves to take high-EV low-probability gambles much harder than deluding ourselves about the probabilities. I am influenced here by my experience as a graduate student trying to do mathematical research, which is all about trying a lot of proof strategies that probably won't work to find the few that do work, and which I found very dispiriting despite my love of math.
2. Practical strategies for reducing the social desirability bias that so often drives a soldier mindset. I think many people are embedded in communities which they value highly, which are important to their life projects and have a lot of really smart and capable and well-intentioned people in them, and in which there sadly are a bunch of empirical beliefs that you basically cannot publicly question without being ostracized from the community. It's not easy to weigh the social belonging vs truth seeking tradeoffs in that case, and "just change your friend group and coworker group to be all rationalists instead" is easier said than done.