“Science is the belief in the ignorance of experts.” — Richard Feynman
“If I have seen further, it is by standing on the shoulders of giants.” — Isaac Newton
The other day, my friend Derek Thompson — who first got me into opinion writing — tweeted the following:
This seems like a rather odd time to complain about something like this. Even as we speak, many of those conservative Americans Derek talks about are getting sick with a pandemic plague because they ignored scientific experts’ advice to get a safe and effective vaccine:
I consider myself a skeptical fellow, and that includes skepticism of public health pronouncements. When experts told us that masks were ineffective at the beginning of the pandemic, I never believed them for a second. I was also always partial to the lab leak theory. But when I saw the torrent of studies showing that the Covid vaccines were safe and effective, I believed them. I got the vaccine, and unlike the people who distrusted the experts, I don’t have to fear going to the hospital or dying from the Delta variant now sweeping through the country. And in fact, our whole nation seems to have been much more robust against infectious diseases when we had more automatic, unquestioning trust in the vaccine experts:
So the question of when to trust scientists vs. when to believe in the ignorance of experts is actually a subtle and difficult one. There’s got to be some middle ground between disbelieving every vaccine and unthinkingly obeying anyone in a lab coat.
Scientists trust scientists
Richard Feynman did say that science is the belief in the ignorance of experts. But are you going to take his word on that? If you read Feynman’s autobiographies, you’ll see that he spent a lot of time trying to reinvent the wheel on all sorts of problems — he was an amazing scientist, but maybe he could have gotten more done if he had spent more time building on other people’s work. (Meanwhile, when he investigated the Challenger disaster, his first move was to go ask the engineers who worked on the rocket which part they thought might have failed!)
When you look at how science actually progresses — or when you do a bit of research yourself — you realize that scientists trust other scientists to an enormous degree. The entire citation system is based on this. The whole scientific enterprise is additive — researchers take other researchers’ conclusions as given 99% of the time, and then riff on those. They’re not out there reinventing the wheel. Occasionally they do a replication, or accidentally discover that someone else was wrong. But this is overwhelmingly less common than simple acceptance of other people’s results. If scientists believed in the ignorance of experts, it would mean believing in the ignorance of their colleagues, and they’d never get anything done.
And as the body of scientific knowledge gets bigger and bigger, the need for scientists to trust each other’s expertise only grows larger.
Of course, this creates all kinds of problems for science. For example, it leaves the field wide open to fraud, which some scientists believe is much more rampant than is widely recognized. It also forces scientists into overspecialization, which might be slowing down the rate of technological progress. But what’s the alternative? Scientists should probably put some more effort into policing fraud and checking each other’s results, but at some point that hits diminishing returns. Any cumulative group project like science will ultimately involve a huge amount of trust.
So we should realize that Feynman was being his usual flamboyant, expansive self when he uttered that quote. Yes, of course science requires the belief that there’s still more to discover, but everyone knows that. The truth is, science itself is mostly standing on the shoulders of giants.
Skepticism vs. crankery
So how should the rest of us think about science, and scientific experts? How much should we believe or not believe? How do we stay skeptical without turning into cranks?
Well, there seem to be some obvious ground rules. First, we shouldn’t let politics interfere with our factual beliefs. I used to be more confident that people could do this, actually. There’s some research showing that when you pay people to get answers right on factual questions about the economy, partisan belief gaps sharply diminish. To me, that always suggested that when push comes to shove, people know the difference between reality and partisan fantasy. Yet here we are, watching the spectacle of millions of Americans exposing themselves to a plague because their politics won’t allow them to believe it’s dangerous, or that vaccines are safe and effective. So much for the triumph of incentives.
(There’s also the question of how our scientific establishment can reduce the actual incidence of political denialism. If conservatives perceive scientists as a liberal monoculture, that will make them more likely to irrationally distrust what the scientists say. But how scientists can make American conservatives comfortable is not entirely clear. In general, convincing American conservatives that you’re not out to destroy them is a very tricky problem, and one that I’ve personally struggled with my entire life. It may simply be that in a polarized country, trust in institutions invariably suffers, and there’s not much you can do about it.)
Another ground rule is to apply the same evidentiary standard toward competing hypotheses. If there are tons of studies supporting the safety and efficacy of vaccines versus one questionable study supporting the use of ivermectin as a Covid treatment, don’t let yourself think that vaccines and ivermectin are equally supported by science.
A third rule is not to feel a sense of ownership over your own hypotheses or theories. Just because you thought of something doesn’t make it any likelier to be right. The same goes for the first theory you heard. Another Richard Feynman quote I like is: “The first principle is that you must not fool yourself and you are the easiest person to fool.”
In fact, there is a huge set of heuristics and principles for how to be skeptical without being a crank. People like Julia Galef have spent a huge amount of time and effort thinking about this problem — essentially, building up expertise in the field of not being an expert about things. You could do worse than to buy and read her book!
But basically there’s no easy, general answer to the problem of how much to trust the experts and how much to trust your own analysis. It’s a hard decision theory problem, involving your choice of loss function (i.e. what you care about), your priors on tons of stuff, and many, many specific characteristics unique to each situation. Some of the things that matter (see the sketch after this list) include:
Whether the experts have a reason to lie to you
How empirically grounded the scientific field in question is — whether it’s something humans understand thoroughly (like astronomy), or whether it’s still mostly in the theoretical stage (like macroeconomics)
How public the relevant evidence is (i.e., how easy is it for you to dig in and check the evidence if you want to)
How much consensus actually exists among the experts
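Since I keep calling this a decision theory problem, here is a minimal, purely illustrative sketch of what that weighing might look like in code. Nothing here comes from any actual study; the function, the probabilities, and the losses are made-up placeholders I picked just to show the structure: an expert consensus moves your belief more when that consensus would be unlikely if the experts were wrong, and what you then do with that belief depends on your loss function.

```python
# A toy sketch (not anyone's real model) of "whom do I trust" as
# expected-loss minimization. All numbers below are invented for illustration.

def posterior(prior, likelihood_if_true, likelihood_if_false):
    """Bayes' rule for a binary hypothesis H given one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothesis H: "the vaccine is safe and effective."
prior = 0.7  # your belief before hearing the experts

# Evidence: a strong expert consensus. How likely is that consensus if H is
# true vs. false? The gap encodes how much you trust the experts: incentives
# to lie shrink it, open public evidence and a well-grounded field widen it.
p_consensus_if_true = 0.9
p_consensus_if_false = 0.2

belief = posterior(prior, p_consensus_if_true, p_consensus_if_false)

# Loss function: what each action costs you in each state of the world
# (i.e. "what you care about" -- again, purely illustrative numbers).
loss = {
    ("vaccinate", True): 0.0,    # safe vaccine, you took it
    ("vaccinate", False): 5.0,   # unsafe vaccine, you took it
    ("skip", True): 10.0,        # safe vaccine, you risked Covid instead
    ("skip", False): 1.0,        # unsafe vaccine, you dodged it
}

for action in ("vaccinate", "skip"):
    expected = belief * loss[(action, True)] + (1 - belief) * loss[(action, False)]
    print(f"{action}: expected loss = {expected:.2f}")
```

With these particular toy numbers, getting the shot wins easily. But change how surprising the consensus would be if the experts were wrong (point 1 above) and the answer can flip — which is roughly the situation with the pre-2008 risk models discussed below.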
Those are all going to be very different in different situations. For example, take two recent examples: 1) the Great Recession, and 2) the Covid vaccines.
In the lead-up to the financial crisis, many financial risk models failed, and if more non-experts had questioned these, we might have avoided catastrophe. In the wake of the crash, a number of economists regarded as top experts advised against fiscal stimulus, and all of them turned out to be very very wrong. A few years later, a smaller number of experts warned that quantitative easing would lead to inflation and financial instability, which also turned out to be very very wrong. Trusting these experts would have been a big mistake.
To understand why distrusting these experts was a good idea, we can look at the four factors I listed above:
The experts making risk models indeed had a reason to lie to people — they made money for moving product, and they could move more product by saying it was less risky (whether they were actually lying or just deceiving themselves very skillfully is something I won’t try to adjudicate). Meanwhile, the experts urging fiscal austerity and hard money were generally conservative types who saw stabilization policy as a dangerous precedent for government intervention in the economy; this might not have made them lie, but it certainly colored their views.
Financial risk modeling might seem empirically well-grounded, but a close observation of the models being used would reveal that they were being used for the first time, and that the data backing them up was not rock-solid. Macroeconomics is not an empirically well-grounded science — not because macroeconomists are a bunch of jokers, but because it’s simply very hard to get clean experimental tests of macro theories.
The relevant evidence on financial risk was opaque and hidden. As for macroeconomics, the relevant evidence was generally public but was thin and unreliable (see previous point).
Although the loudest macroeconomic experts were the decorated pro-austerity hard-money people, the bulk of the profession actually supported a more activist government approach. That fact didn’t really come out until too late, unfortunately, which is one reason the econ profession has become more public and open since the crisis. As for financial risk modelers, I’m not sure how divided they were, but see point (1).
Now compare this to the situation with Covid vaccines:
Public health experts might be liberals looking to make the Biden administration look good, but A) they promoted the vaccines heavily while Trump (who oversaw the development of the vaccines and could have claimed credit) was still in office, and B) a botched vaccine campaign would reflect very badly on Biden anyway. So even if you think public health people are a bunch of super-woke libs, there wasn’t much reason for them to lie here. (Masks were a different matter; the fact that experts very publicly worried about shortages of masks for hospital workers was a huge red flag.)
Vaccination is a pretty empirically well-grounded field; we have a ton of precedent for RCTs and real-world applications, and we have a decent understanding of virology itself. Of course, these vaccines came out especially quickly, and regulatory requirements were reduced, but the basic science was still the same.
The relevant evidence was all out in the open. You could just look at graphs like this:
You could look at the clinical protocols, etc. You could look at papers on PubMed concerning real-world efficacy and safety. Etc.
There were, to my knowledge, no respected experts claiming that vaccines don’t work or are unsafe. That could mean experts are a complete brainwashed conformist monoculture, so it’s not dispositive of course. But the consensus does mean something.
Anyway, we see enormous differences between the two situations here. With the financial crisis and the Great Recession, there was every reason to be skeptical of the pronouncements of experts, both in the realm of risk modeling and in macroeconomic policy directly after the crash. But when it comes to Covid vaccines, there is really very little reason to be skeptical.
Now, “very little” doesn’t mean “no reason”. If you’re skeptical of vaccines, by all means, dig into the (very public) data, look for hints of expert dissent, educate yourself on virology, and so on. I’m sure there are a few people out there who are so honestly, rationally skeptical that they actually did this — after which I’m sure they went and got their shots.
Anyway, you can get better at the skill of being a non-crank skeptic, but it will never be a solved problem, no matter how smart or knowledgeable or rational you are. You don’t just have to live with uncertainty about the actual facts; you have to learn to live with uncertainty about whether you’re actually processing those facts in the most rational manner. That just adds one more layer to the confusion and the risk of this Universe and this life.
A quick comment on the tweet about number theory papers, which in fact noticeably understates the scope of the situation.
-- Actually 100 pages is nowhere near the upper limit.
-- Actually I *wish* I could "deeply understand" some of these papers (even ones in my own subfield!) merely by taking a semester-long course.
-- It's most of math, not just number theory, and it's not just because the papers are long. The theory required to even *state* the results is piled so deep that one cannot reasonably expect to check all of it for oneself.
-- Nevertheless, I'm surprisingly confident that most of it is basically true, even though the literature is probably filled with superficial errors. This is because it's all interconnected and everyone spot-tests bits and pieces of the edifice as they poke around in their tiny domain. A truly catastrophic problem would eventually lead to a contradiction and thus be noticed.
-- While there does exist the *very, very rare* mathematician who resists admitting error, and there are subfields where errors are rampant and confidence is low, the extent of uncertainty still operates at many orders of magnitude below what is normal in any other area of science; there is simply no plausible comparison. For this I am very grateful.
Personally I think a pair of Ray-Bans would suit Noah better than the blue reflectors when it comes to sunglasses.
And let me join in the self-congratulation: I also knew that the mask thing wasn’t true. You would not believe how many Facebook arguments I got into with people about that.
I think where scientists get in trouble is when they conflate science with social policy. My big one would be mask mandates. I’m vaccinated, and I hate masks. Now I have dug into all of the numbers, and I know that if you’re vaccinated you have way less chance of getting and spreading Covid than you would as a non-vaccinated person wearing a mask. Therefore I feel quite comfortable not wearing masks, and not guilty at all about not wearing them.
But inevitably someone will say science says you can still get Covid after you’ve been vaccinated.
Oh anyway, I digress. Vaccination has just become this weird partisan issue.
Right now I’m working with three other Americans here in Brazil, so I took a survey and did a little case study.
First a general observation: none of us knows anyone who has died of COVID. We all know, in passing, one or two people who were hospitalized and released. Think a coworker you see occasionally.
Me. 51. White dude. Swing voter. Trump disliker. But not liberal. Vaccinated. Did a lot of research. Never had COVID.
Engineer: 27. White Democrat. Standard. Not super liberal, but mainstream. Vaccinated. Never had COVID.
Winder 1. 35. Black. Moderate to conservative. Had COVID. Not vaccinated. Fairly knowledgeable. He figures they will tweak the vaccine in the future to handle all the variants, and once it’s approved he will probably get it then. Knows he is somewhat protected. Comfortable with risk.
Winder 2. 35. White. Conservative. Likes Trump. Hates Biden. Had COVID. Not vaccinated. Knows he is somewhat protected. Waiting for FDA approval.
None of them had any conspiracy theories about the vaccinations. None are overly political. Conversations are about chasing Brazilian women instead of politics. All were basically familiar with the issues. No science deniers.
Quite frankly when I spoke to them, each person seemed pretty rational to me. It was more about risk vs unknown risk.
Anyway, what the rabid vaccine and mask preachers miss and don’t really comprehend is that even though there is a pandemic, the majority of people don’t see it or experience it. Their loved ones aren’t dying. Even if someone knows someone who died of COVID, it was someone who was very old or sick already. I don’t think policy makers really grasp this, or how much of the doom and gloom comes across as overblown. This is how conservatives can rationalize not getting vaccinated.
Anyway, I am typing this on my phone, so forgive any spelling and grammar mistakes.
Winder 1 just asked for a copy of the email. May update after he reads it.
Also, there is a tweet going around talking about people having cross-partisan friendships. I definitely don’t have to worry about that. I love my job and the people I work with. Long hours. Away from home. But working in energy contributes to society. Diverse coworkers.