30 Comments
May 29, 2021 · Liked by Noah Smith

In addition to federal research funds stagnating, the administrative overhead of doing science is also increasing significantly. A standard NSF proposal has about 15 pages of science plus 100 pages of mandated documents: conflict-of-interest spreadsheets, facilities lists, data management plans, postdoc mentoring plans, and so on. Every proposal has to include all of this even though fewer than 20% are funded. It can take at least six months, and often over a year, to get a decision on a proposal, so one has to send them out fairly frequently to keep up a funding stream. There is also a greater emphasis on partnerships and big collaborations to win research funding, but large collaborations carry a much greater communication overhead. Administrative mandates keep multiplying, devouring scientists' time with little benefit to show for it.


The metric for how many ideas we're finding -- counting Nobel Prizes, for example -- is hopelessly squishy. Some are given to scientists who have made many contributions, but the committee has to select just one (Einstein for the photoelectric effect: they picked the work easiest to describe). And the impact of, say, the first observation of slightly higher superconducting temperatures is many orders of magnitude less than the identification of the structure of DNA or the observation of gravitational waves.

The "Moore's Law" analyses of the exponential rate of growth of the capability of some particular technology are indeed the result overlapping curves in which each advance in the technology saturates. And the overall timescale is really the period of a cycle which passes through innovation, development of manufacturing capability, development of a market for the products, all feeding back to stimulate new innovation. Take batteries. If you plot watt-hours per kg from 1900 (Volta) to 2000 (Li-Ion), you find a doubling time of about ten years. That takes us from telegraph repeaters through flashlights to hand held power tools and leaf blowers. Opening of markets for cellphones and electric vehicles has almost certainly changed the slope of the curve by increasing the market pull, and introducing production facilities so expensive that much more R&D falls out of the investments, and the time permitted for recovering the cost has to shorten.

So I think that asking about the rate of science without looking at the obstacles to completing the cycle of adoption may miss the actual limiting steps.


Anton Howes (an innovation historian) talks about this on a recent podcast. But we also need to keep innovating. https://www.thendobetter.com/arts/2021/5/21/anton-howes-on-innovation-histroy-the-improving-mindset-and-progress-studies-podcast

May 29, 2021 · Liked by Noah Smith

Did you read David Roberts' post yesterday about new economic modeling on grid development?

https://www.volts.wtf/p/rooftop-solar-and-home-batteries

Big evidence on the techno-optimism side. Some of the most serious people in the business of forecasting for utilities think that it's actually _cheaper_, over the next fifty years, to de-carbonize the economy with a mass rollout of distributed energy, than to continue "business as usual" with maintaining and expanding the transmission grid. And that's completely leaving aside the health / environment concerns. ("What exactly is the business case for immolating your customers?")

Once you have mass-scale solar and storage, you can systematically shave down the peak level of load -- the worst-case scenario that you hit maybe two or three days a year. The _whole grid_ has to be built so it won't fall apart under that peak load. Knocking it down by 10-20% _drastically_ lowers the cost of the transmission grid and distribution lines. Those savings _way_ more than offset the cost of the distributed-resource investments. Like, by at least hundreds of billions of dollars.
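The arithmetic behind that claim is simple enough to sketch. Every figure below is a made-up placeholder, not a number from the Volts post:

```python
# Toy model: every GW of peak capacity avoided saves the cost of the
# transmission and distribution that would have been built to serve it.
peak_load_gw = 100        # hypothetical system peak
grid_cost_per_gw = 2e9    # hypothetical $/GW of T&D built for peak
shave_fraction = 0.15     # the 10-20% peak reduction cited above

avoided_gw = peak_load_gw * shave_fraction
savings = avoided_gw * grid_cost_per_gw
print(f"Avoided T&D build-out: ${savings / 1e9:.0f}B")  # $30B in this toy case
```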


Such a nice piece. Thanks.

Imho, there are three aspects. First, more people (researchers?) are actually involved in duplication efforts than in adding to the pool of NEW knowledge/outcomes. The Wikipedia model of crowd collaboration is mostly crowd duplication.

The second point is about the seeding of NEW S-curves. Yes, whatever the discovery, it will plateau in the future. The history of science shows sharp growth (the middle part of the S-curve) when more seeding (new ideas) happens. My fear, maybe controversial, is that over-dependence on mechanization is slowing down seeding; after all, a new idea first comes from thought experiments. Einstein explained the photoelectric effect through pure theory first. Right? Seeking CERN proof of new particles at the bottom part of an S won't help us; the best example being the Higgs boson.
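For concreteness, the S-curve being invoked here is usually modeled as a logistic function: growth is slow at the bottom, steepest in the middle, and flat at the plateau. A minimal sketch (all parameter values are arbitrary illustrations):

```python
import math

def logistic(t, ceiling=1.0, rate=1.0, midpoint=0.0):
    """Logistic S-curve: slow start, steep middle, plateau at `ceiling`."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Growth per step is largest near the midpoint and vanishes at the top --
# the commenter's worry about probing the bottom or top of an S.
for t in range(-4, 5, 2):
    print(t, round(logistic(t), 3))
```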

The third point is a little more controversial; actually I don't know much, I just wonder: is the intelligence/creativity of Homo sapiens at the top of the pyramid slowing down (an S-curve???)? Let's accept one truth: the growth of science has always been directly proportional to a handful of pathfinders. Whether Galileo, Newton, Darwin, or Einstein, only a few super-geniuses seeded the later explosive growth. The situation is unlike business leaders; it's more akin to super-geniuses in fine arts like painting, music, and dance. Though the number of dancers has increased exponentially, only a few new dance forms have been created. I haven't heard of any painter better than Picasso. Has mechanization made our minds less innovative, lowered our IQ? If so, what effects will AI & robotics have on the future growth of science?

A few random thoughts.


I'm staking out the middle ground in this debate.

On one hand the "Great Stagnation" has never been more than a myth: technology has continued to advance rapidly throughout our lifetimes. The real underlying complaint is that since the 1980s first-worlders have seen investment and growth shift to other parts of the world.

On the other hand, we have made very little progress on truly renewable energy, and we are right now in the midst of a dire collective-action-failure crisis: fossil fuel consumption and greenhouse gas emissions are still growing, with no serious plan to stop that growth. Last summer's wildfires should have made clear to everyone that we have already exhausted Earth's ability to sustain our behavior without violent reactions.

It's not only techno-optimists who seem to want to wish away the very serious problems of limited energy supplies and the side effects of consumption. Most sci-fi assumes that energy in the future will somehow become free and unlimited, even if in pessimistic sci-fi it's typically hoarded by bad guys. For example, here's a beautifully grim new vid mixing animation and live action to imagine a Blade Runner-ish decaying city in which good guys eke out sustenance from urban farming while bad guys hover around in hovercrafts. https://www.youtube.com/watch?v=LsGZ_2RuJ2A

One area where you definitely do need more rigor is in supporting your statements on solar power and batteries. Your evidence seemed to consist only of lower panel prices relative to the past and the growing quantity of installed panels. But solar is still very expensive relative to other power sources, and the growth is driven almost entirely by state support, not markets. Producing solar panels is energy-intensive, and much of the falling cost owes to Chinese state support through megadams, or cheap brown coal. Do the numbers: compare the energy costs of production to the expected energy output over the panel's lifetime, depending on where it's located. The picture is still quite grim. The solar industry still consumes far more energy than it produces. Solar subsidies (including requirements that distributors purchase solar power at regulated prices) make the problem worse, mainly by driving installations in gray-sky areas to which solar is especially poorly suited.
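Here is the shape of that "do the numbers" calculation, hedged heavily: every input below is an assumption for illustration, and the conclusion flips entirely depending on the embodied-energy and capacity-factor figures you plug in -- which is exactly why location matters.

```python
# Back-of-the-envelope energy payback for one panel. All inputs are
# illustrative assumptions, not measurements.
embodied_energy_kwh = 1500   # assumed energy to manufacture the panel
rated_kw = 0.4               # assumed panel rating
capacity_factor = 0.10       # gray-sky site; a sunny site might be ~0.20
lifetime_years = 25

annual_output_kwh = rated_kw * capacity_factor * 8760
payback_years = embodied_energy_kwh / annual_output_kwh
lifetime_ratio = annual_output_kwh * lifetime_years / embodied_energy_kwh
print(f"Payback: {payback_years:.1f} yr, lifetime out/in: {lifetime_ratio:.1f}x")
```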

Batteries are of course also energy-intensive products. If you compare solar alone to solar + battery, the battery shifts the time of power output to when it's needed, but at a very high price in terms of the system's ratio of energy output to energy consumption. In terms of minimizing greenhouse gas output, gas turbines alone beat solar + gas turbines, which beat solar + batteries (and solar alone is not an option).
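Extending the same toy accounting to storage (again, every figure is an assumption): the battery adds embodied energy on the input side and round-trip losses on the output side.

```python
# Toy system-level energy ratio for solar + battery. All figures assumed.
panel_embodied_kwh = 1500
battery_embodied_kwh = 800           # assumed energy to make the battery
panel_lifetime_output_kwh = 8760.0   # lifetime output from the panel sketch above
round_trip_efficiency = 0.85         # typical Li-ion round trip
stored_fraction = 0.5                # half the output passes through the battery

delivered_kwh = panel_lifetime_output_kwh * (
    (1 - stored_fraction) + stored_fraction * round_trip_efficiency
)
ratio = delivered_kwh / (panel_embodied_kwh + battery_embodied_kwh)
print(f"Delivered {delivered_kwh:.0f} kWh, system out/in: {ratio:.2f}x")
```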


Basically all actual scientific work these days is done by people who are ostensibly trainees -- graduate students and postdocs. Professors only have time to write grants and manage.

Since the system is set up this way, if you want to pump money into scientific research, that money has to be used to hire graduate students and postdocs to do the work. This means, for instance, that you can't really get one person to work on a project for more than a couple of years, since they eventually have to either graduate or leave their postdoc position. And since a project that doesn't pan out may mean a student doesn't get their degree, or a postdoc has to leave the career forever, there's a big incentive to try to "hit singles".

This is, a priori, a very strange way of doing things. I wonder how much this has changed over the course of the 20th and 21st centuries, what kinds of effects it has had, whether other countries (Japan?) avoid using trainees to do all the work, and what would happen if we set up institutions to fund permanent "individual contributor" roles for experienced scientists to do hands-on research.

(Incidentally, this structure seems comparable to certain other fields which also require a lot of education and are prestige-obsessed. Consulting, big law, and investment banking are all structured similarly. The actual work is done by a lot of student interns and overworked junior employees. After a few years you either get promoted into management or leave. I wonder whether this is coincidence or not...)


The metrics for measuring costs are a bit off when it comes to the historical side.

When Galileo was doing his thing, books were still SUPER expensive - like a car, or large appliance. They cost several months' worth of a day-laborer's wages, and only the rich could afford them. The precision optics for his telescopes were cutting-edge, and probably cost a pretty penny as well - AND he had to build the things himself, because he'd only just recently read about them being invented elsewhere. Same goes for the precision glass-blowing necessary to build his "thermoscope", and the precision metalworking for all those pendulums he studied. It's not like everyone just had a clock.

When Newton was writing Principia, books were somewhat cheaper, but he still would have had to fund his personal library all on his own. And his research was an idle vanity project, a distraction from productive efforts like managing his affairs.

Science was an elite endeavour back then because only elites could afford it. Pisans were probably HORRIFIED at Galileo's wasting a whole damned sack of potatoes by tossing them off the tower.

Anyways, I'd love to see an inflation-adjusted, per-capita series of actual science spending going as far back as possible. I suspect it has increased only gradually over time, and that the headline numbers are up mostly because we've got more scientists.

Finally, I'd also point out that "we've got more scientists" and "the low-hanging fruit has all been picked" together mean we've got more avenues to pursue. Two hundred years ago, Smith was lucky to be able to piece together supply and demand from early private and public data. Today, economists can spend a whole career studying the price elasticity of a single good, or collecting better labor statistics, or working out game theory. Two hundred years ago, early doctors had to rob graves. Fifty years later, they were still working out germ theory. Fifty years after that, penicillin had yet to be discovered. Today, a researcher can spend a whole career studying a single antibiotic, or hunting for new ones, or teasing out the genetic complexities of an antibiotic-resistant bacterium.

If progress is slowing, it's because the amazing progress we've made has opened up entire universes of new questions, not just that those questions are harder. My Master's thesis was pretty simple - "low-hanging", if you will. I just built a device that would help a patient keep their foot muscle steadily flexed during a minutes-long MRI scan. It wasn't even that great at its job - the stupid pneumatics I used kept losing pressure under the flexing. But there's no reason to build it unless you have an MRI machine, and you know what diabetes is as a disease, and you know that diabetic neuropathy (of the foot) is a symptom that you even could or would want to diagnose and manage. The point is, every new scientific discovery poses a hundred new questions. Even with a vastly more-educated and larger population than ever, I think we're just kind of losing the battle with the exponential nature of available things to research.


Think about how many scientists in previously underdeveloped countries we can now discover. Think of all the women left out of science in the 1920s. Think of all the high school students doing genetic assays for science fair. The underlying model may be correct, but there's huge untapped potential out there.


The other problem I have with the conclusion that science is slowing down is that maybe science just isn't very parallelizable. Throwing more people at one problem (e.g., building smaller chips) doesn't speed things up much, as they all just end up working on similar problems and reproducing each other's work. So of course throwing more people at one problem will make it look like diminishing returns. You might actually get a very similar growth curve with fewer people, just because it takes a certain amount of time to learn what the problems are and come up with a way to fix them, and having ten people do that independently doesn't speed it up. In computer science, most problems take a lot of effort to make them parallelize well. It's not clear we have put that effort in at all with scientific discovery, particularly in something like chip manufacturing, where it is just many companies competing with each other and not communicating.
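This intuition is essentially Amdahl's law from computing, applied to research. A minimal sketch; the 20% parallel fraction is an arbitrary illustration, not an estimate:

```python
def amdahl_speedup(parallel_fraction, workers):
    """Amdahl's law: overall speedup from `workers` when only
    `parallel_fraction` of the work can actually be split up."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / workers)

# If only 20% of discovery work parallelizes, even unlimited workers
# cap out at 1 / 0.8 = 1.25x -- diminishing returns from headcount.
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(0.2, n), 3))
```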


Is there any research that explains where federal R&D spending is going instead? I imagine a lot of it's going to entitlements. In the 2000s a lot went into wars, though I suppose the sequester curtailed some of that.


I'm in over my head, but that has never stopped me before.

Thought experiment: put yourself in the place of the person who, two or three millennia ago, dreamed up the mechanical helix, the waterwheel, mechanisms powered by steam, an air pump, water organ, a chain drive (in a repeating crossbow), the gimbal, etc. The guy was sure he'd become a household name and get as rich as Croesus, but ... nope.

Humanity didn't get a true steam engine until the 18th century. The entire Roman empire might have been chock-full of watermills but wasn't. So my question, with the caveat that this is the first that I've become aware of this mega-debate on stagnation, is this: how much does it matter?

That is, IF social and cultural forces (which I take to have been the reasons why the Greeks and Romans were so much better at generating ideas than capitalizing on and implementing them) are actually far more powerful and influential than a great insight or clever invention, are people asking the right questions in this stagnation debate?

It's not THAT hard to imagine a point in the future when only a significant minority of people will be willing to get vaccinated against a virus like Covid-19. Social and cultural forces! Maybe the danger is not so much that we're running out of ideas or that science is getting too expensive and cumbersome to do well. Maybe it's that everything NOT-science is failing to keep up, even regressing in some areas. The surge of religious fundamentalism in the 20th century is a prime example, but not the only one.

Running out of good ideas? No problem. Let's actually try teaching evolution in Tennessee or Arkansas high schools and see what that does for us.


One aspect of rising cost that I have always wondered about is whether getting each new generation of scientists to the forefront of the field gets progressively more expensive over time. We are not only standing on the shoulders of giants but on the shoulders of giants themselves standing on the shoulders of giants, etc. It takes a lot of giant-climbing to reach the frontier. Educational technology may then be important in bringing the productivity of science back to its historical trend.


Perhaps a simpler question to answer: is mathematics slowing down? I don't see any particular sign of that. Mathematics is not big science: you don't get more results by throwing big research teams at a problem. It's still mostly individuals and small collaborations, and the frontier is limited by what one person can comprehend in a lifetime. There are still vast frontiers to explore, and some are quite accessible to laypersons. The modern age adds computers to the mix, and faster computers might make some types of exploration easier, but the end goal is always human-understandable results (and sometimes human-useful results).


Even if fewer discoveries are being made, there will still be those who apply those new discoveries in different ways.


Quick note: it's "et al.", not "et all."
