The dream of bringing back Bell Labs
Was America's most famous corporate lab a product of its time, or something that can be reproduced?
I just finished reading The Idea Factory: Bell Labs and the Great Age of American Innovation, by Jon Gertner. It’s a fun and fascinating book, chock-full of stories about the development of vacuum tubes, the invention of the transistor, information theory, communications satellites, fiber-optic cable, and so on. The personalities really come to life — the pragmatic Mervin Kelly, the arrogant and paranoid William Shockley, the obsessive and curious Claude Shannon, the distractible and creative John Pierce. I highly recommend it.
That said, the book is also a bit frustrating, because it grasps at a general theory of collective innovation, but never quite reaches a conclusion. How much of breakthrough discovery is due to the team vs. the individual? How do incentives and organizational structure matter? How much of the success of Bell Labs was due to the luck of history versus the unique way the organization was set up? The scientists at Bell themselves grappled with these questions, but nobody has ever answered them conclusively.
The way America innovates has certainly changed since the mid-20th century. In a 2019 paper, the economists Ashish Arora, Sharon Belenzon, Andrea Patacconi, and Jungkyu Suh document two major trends: the rise of university research and the decline of corporate labs. Basically, businesses still spend a ton on R&D, but more and more of that spending goes to development (getting innovations ready for market) rather than to research itself.
Bell Labs is one of the iconic corporate labs, along with Xerox PARC, DuPont’s labs, and a few others, whose decline illustrates the general trend.
The authors document two main reasons for the change. First, the government started supporting universities a lot, both with money and with legal changes (especially the Bayh-Dole Act) that allowed universities to profit from licensing their research to private companies. Second, the maturation of the startup and venture capital ecosystem, increased competition, and a more tolerant government attitude toward acquisitions all pushed big companies to buy innovations rather than produce them in-house. Note the ambiguous effect of antitrust here — stronger antitrust promotes competition (as when AT&T agreed to break up the Bell System in 1982), while weaker antitrust lets companies substitute M&A for research.
The net result of these changes was to create a sort of supply chain for innovation — universities do the scientific research, startups and university spinoffs and mission-oriented government agencies like DARPA turn insights into inventions, and big companies acquire the inventions and focus on turning them into products. Here’s how the authors sum it up:
In summary, the new innovation ecosystem exhibits a deepening division of labor between universities that specialize in basic research, small start-ups converting promising new findings into inventions, and larger, more established firms specializing in product development and commercialization… Large firms therefore invest in scientific capability not so much to generate knowledge as to be effective buyers of knowledge.
From the standpoint of economic theory, this sort of supply chain makes a lot of sense. Research discoveries and fundamental inventions are public goods, because they have a strong tendency to leak out across organizational boundaries, making it nearly impossible for whoever did the research to capture the monetary value. The transistor was invented (discovered?) at Bell Labs by John Bardeen, Walter Brattain, and William Shockley; how much of the total monetary value created by transistors was captured by those individuals, their research teams, or Bell itself? Very, very little of it. This means that private companies have very little monetary incentive to do basic research, which is why economists generally think basic research should be left to universities and the government (or, depending on which economists you ask, to quixotic rich people).
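To put the public-goods problem in stylized terms (my own notation, purely illustrative, not from Arora et al.): suppose a discovery creates total value $V$, the discovering firm can capture only a fraction $\phi$ of that value, and the research costs $c$. The firm funds the work only if

$$\phi V > c$$

Because basic research leaks across organizational boundaries, $\phi$ is close to zero for discoveries like the transistor, so the condition fails even when $V$ is enormous. Development, by contrast, has a much higher $\phi$, which is exactly why corporate R&D budgets have drifted toward it.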
Arora et al. sound a note of caution, though. They point out several attractive features of big corporate labs that the new innovation supply chain might lack. First of all, while many university researchers focus on questions of general scientific interest (how the Universe was formed, for instance), corporate labs tended to focus on the creation of general-purpose technologies — things that had lots of pragmatic industrial uses, like transistors. The inflationary theory of cosmology and the invention of the transistor are both “basic research”, but their economic value is very, very different. Second, corporate labs tended to be multidisciplinary, pulling in researchers from a wide variety of fields and having them work together on projects, while university research — as anyone who has worked in academia knows all too well — tends to be heavily siloed by discipline.
For these reasons, universities may not be producing the research that would have the biggest economic benefit. Now, you might not care about this — after all, learning where the Universe came from is pretty cool! But at some point we have to face the fact that we’re spending more and more on research while getting only about the same amount of total economic benefit for all that spending.
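One rough way to make that claim concrete (an illustrative definition of mine, not a formula from the paper): define the economic productivity of research as

$$\text{research productivity} = \frac{\text{growth in economic output attributable to new ideas}}{\text{effective research effort (researchers, or R\&D dollars)}}$$

If the numerator holds roughly constant while the denominator keeps climbing, the return per research dollar is falling, which is the pattern I’m describing.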
At some point we should probably think about increasing the economic productivity of scientific research. The successes of Bell Labs and other big corporate labs in the mid-20th century have many people thinking that maybe this is an important missing piece of our modern innovation ecosystem.
Not completely missing, though. As Arora et al. point out, Google’s AI divisions have been an important driver of research in the machine learning space — an extremely important frontier. All told, the research output of Google AI, Google Brain, DeepMind, and the rest has been truly staggering.
Big private companies (especially IBM) are also very active in quantum computing research. And some “startups” like SpaceX are big enough to do research in-house that pushes the boundaries of general-purpose technology instead of just making a quick buck. So Arora et al. are probably overstating the changes to America’s innovation structure a bit; as their own data shows, businesses still do quite a bit of research, even if not as much as before.
But could we bring back something as amazing as Bell Labs? Many people argue that Bell’s success was due to very unusual historical conditions that we probably won’t ever see again. I’ll outsource this argument to Ilan Gur, the CEO of the UK’s science innovation agency ARIA, who laid it out in an interesting Twitter thread last January.
Gur argues that there’s simply too much well-funded, widely distributed competition for research talent and discoveries these days for an organization like Bell Labs to dominate the landscape. He suggests that instead we should focus on helping startups do deep fundamental innovation, as Moderna and BioNTech did with mRNA vaccines.
It’s a reasonable thesis, and I think there’s lots of value in deep-tech “super-startups” (a term I just invented) that put tons of money into fundamental research on mRNA, reusable rockets, aneutronic fusion, generative AI, solid-state batteries, CRISPR, and any number of other fundamental technologies. But there’s a natural limitation here, which is the whim of the market itself. We just saw a massive decline in tech stock prices, which led to a big crunch in startup funding, especially for expensive late-stage startups. So far, software companies seem to have taken the brunt of the current crash, while deep-tech companies are doing OK. But given the natural difficulty of capitalizing on fundamental research discoveries (there’s that pesky public-goods problem again!), it seems likely that deep-tech super-startups whose efforts will take many years to come to fruition will find themselves in the firing line at some point.
So I think there might still be a role for more Bell-style corporate labs in fields where lots of important breakthrough research seems left to be done. In The Idea Factory, Gertner suggests two possible such fields: biotech and energy. Let’s consider the second.
The energy transition is one of the most important events of our time, both because of the need to avert climate catastrophe and because of the promise of ultra-cheap energy. Right now, while startups are trying to do some basic research in energy tech, most of the effort is being done by universities, government labs, and mission-focused government agencies like ARPA-E. But what if there were a Bell Labs for energy tech?
Google can fund AI research with the money from its quasi-monopoly in search advertising. But who would fund a giant, world-class private lab system to do fundamental energy research? There are no big monopoly companies in the energy space — our big energy companies are all fossil fuel companies, and they’re spending their efforts on slowing the energy transition rather than speeding it up.
So here’s a wacky idea: What about a national electrical utility? Many utilities are already government-sanctioned monopolies at the city level, so the principle of natural monopolies in the energy space is well-established. And the intermittency of wind and solar means it might make sense to run new long-distance power lines between cities. So why not turn this task over to a national-level government-sanctioned utility monopoly, that would own the long-distance transmission lines and would also be allowed to invest in local generation capacity?
In fact, Bernie Sanders has already suggested creating a government agency to do this, but that’s probably a political non-starter, and it would have questionable incentives as well. A sanctioned national utility company — call it Energy Bell — would have its profit margins limited by law. It would still be driven by profit, but with margins capped, the only way it could grow its profits would be to deliver greater volumes of energy to American customers. So just as Bell’s driving mission was to connect the whole world, a monopoly energy company’s mission would be to provide more abundant energy to more Americans (with laws to make sure it’s mostly or entirely clean energy).
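To see the incentive in stylized form (again, my own toy notation): if Energy Bell earns

$$\pi = m \times p \times Q$$

where $\pi$ is profit, $m$ is the profit margin, $p$ is the regulated price per kilowatt-hour, and $Q$ is the volume of energy delivered, then a legal cap $m \le \bar{m}$ leaves growing $Q$ as the only lever for growing $\pi$. Selling more energy to more customers becomes the profit strategy.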
Meanwhile, the comfy position of a sanctioned monopoly would give Energy Bell the long time horizon to establish its own big corporate research lab, focused on solving deep fundamental scientific problems related to energy — solid-state batteries, batteries that don’t use lithium, aneutronic fusion, seasonal energy storage, solar panel and battery recycling, or whatever. And just to provide a little extra incentive, doing this research could be a condition of Energy Bell’s continued existence, written right into its charter.
And Energy Bell’s large scale could give it the money to hire many of the world’s best researchers in a number of disciplines and put them to work in multidisciplinary, cooperative ways. If we allowed Energy Bell to ignore the cap on employer-sponsored green cards, it would be even easier to assemble this super-team.
Anyway, I realize this is a wacky idea, and very unlikely to happen. We’re in an era of rising distrust of big monopolies, and lots of people in both industry and politics would likely resent a company like Energy Bell. On top of that, there are probably a bunch of technical problems with the idea that I haven’t yet thought of. But if we wanted to bring back something akin to the old Bell Labs, in an incredibly crucial area of fast-moving technology, I think this would be a promising way to do it.
Barring something like an Energy Bell, though, I agree with Ilan Gur that we’re unlikely to see something quite like Bell Labs in our lifetimes. The university-DARPA-startup innovation system is probably here to stay, and we should focus on finding ways to make this technological supply chain work in a more efficient, purposeful, and well-integrated way.
Update: Brad DeLong and I discussed a lot of these ideas in the latest episode of our podcast, Hexapodia.
Interesting analysis, but I don't quite agree with it.
First problem - the decline of corporate R&D is overstated. These graphs show relative shares, but all that means is that governments flooded universities with money, which the universities then spent on expanding their quantity of output. Quality, however, is often missing. I've been in the tech industry for a long time and seen a lot of R&D happen, but the reliance on academic research is pretty minimal, even in the AI space. Part of my job involved reading academic papers for a few years, but I eventually gave up because the ROI was zero. Lots of papers made enticing-sounding claims but, when examined carefully, had caveats that made them useless.
Second problem - the distinction between research and development. Very important for academics, not at all important for companies. When I think about the successful big-tech projects I've seen, there was no clear line delineating the two. Experience and results from development informed research, and research was often the same thing as development, just on more speculative features. The same people would do work that smelled like research for six months, then development of those research ideas into a product launch for another six months, then back again, with no change in job title or function. I've done such projects myself.
Thirdly - the assumption that "papers published at academic conferences" is the same thing as output. Very few people in the corporate world care about publishing papers; it's just a distraction. Unless they came from academia and publishing has become part of their identity, or they want to keep the door open to returning, papers just aren't going to happen. The only peer reviews you care about are actual reviews by your actual peers, i.e. managers, because that's what determines whether your R&D turns into promotions or bonuses. Google's AI research is really an odd duck in this respect, in that they seem to care a lot about getting published, but that's really rare.
Obviously, if you just plot papers published, then universities will look very productive and it'll seem like there's a pipeline or supply chain or something here, because that's what universities are paid to do and what they optimize for. If you've actually been in real companies doing cutting-edge work with tech, like I have, then it all looks like a shell game over the definitions of arbitrary words. Companies invest a ton in "R&D", just not in a way you can neatly sum and plot on a graph. Often they struggle to even measure it for their own purposes!
Finally - a nationalized energy company that can fund research? I know this is a left-wing academic blog, but come on. That would wreck the energy markets and deliver absolutely nothing in the way of usable research output. The West already dumps way too much money into public and academic energy R&D labs, on top of VCs pumping up private firms, and Google also funded lots of energy R&D in the past (see here: https://www.google.org/pdfs/google_brayton_summary.pdf ). There's very little to show for it, because doing proper R&D that matters requires experience in the field, so it all gets done by oil companies, wind turbine manufacturers, etc.
Excellent, but some observations.
The transistor is not basic research like cosmology. It was based on basic research done a half-century earlier by Bohr, Planck, and Einstein.
Some misses. The greatest industrial research center in America for 200 years was the DuPont Experimental Station. And they very much capitalized! Nylon. Gunpowder. Lycra. Synthetic fibers like polyester. As well as agriculture.
Sarnoff Labs... TV and radio development, but not the basic research... Marconi et al.
GE Schenectady - basic research on plastics for wire insulation led to polycarbonate and others. Electrical... well, developments, not really basic research.
Still, Bell Labs was the premier research organization in the world.
Bell Labs invented fiber optics in the late '50s, used as a super-top-secret sensor deep underwater for detecting submarines. The laser, also needed in that scheme, they co-invented. But the basic research for both was based on the basic optics of Newton and Maxwell, and Einstein.