Interview: Patrick Collison, co-founder and CEO of Stripe
In addition to being a friend and a Noahpinion subscriber, Patrick Collison is one of the world’s most successful founder-CEOs. Along with his co-founder and brother John, he built online payments company Stripe into a $36 billion behemoth in a decade. (Patrick and John hail from Ireland, continuing the hallowed tradition of Irish immigrants making it big in America.)
But in addition to being an ace businessperson, Patrick is one of the most intellectually curious human beings I’ve ever met. Hanging out with Patrick typically involves him asking you to explain ideas in your area of specialization, which he will grasp with unusual alacrity; at least once, he threw a party that consisted of people giving him seminar talks on topics of their choice! His biggest intellectual interest is technological progress — what it means, why it happens, and how to encourage more of it. Among his initiatives is Progress Studies, an interdisciplinary field that studies the history and determinants of technological advancement and how it feeds into social progress more generally.
In the wide-ranging email interview that follows, I turn the tables and ask Patrick to explain his own ideas to me! We discuss the future of technology, what kinds of innovation humanity needs, whether innovation has slowed down and what to do about it, the proper roles of government and the private sector, how research funding can be reformed, how international competition (including U.S.-China competition) figures into the equation, and much more!
N.S.: So, what are the three things that excite you most about the 2020s?
It's hard to restrict to three! But here are the first that jump to mind:
First, the explosive expansion in access to opportunity facilitated by the internet. Sounds prosaic but I think still underestimated. Several billion people recently immigrated to the world's most vibrant city and the system hasn't yet equilibrated. When you think about how YouTube is accelerating the dissemination of tacit knowledge, or the number of creative outsiders who can now deploy their talents productively, or the number of brilliant 18-year-olds who can now start companies from their bedrooms, or all the instances of improbable scenius that are springing up... in the landscape of the global commons, the internet is nitrogen fertilizer, and we still have a long way to go -- economically, culturally, scientifically, technologically, socially, and everything in between. I challenge anyone to watch this video and not feel optimistic.
Second, progress in biology. I think the 2020s are when we'll finally start to understand what's going on with RNA and neurons. Basically, the prevailing idea has been that connections between neurons are how cognition works. (And that’s what neural networks and deep learning are modeled after.) But it looks increasingly likely that stuff that happens inside the neurons -- and inside the connections -- is an important part of the story. One suggestion is that RNA is actually part of how neurons think and not just an incidental intermediate thing between the genome and proteins. Elsewhere, we're starting to spend more time investigating how the microbiome and the immune system interact with things like cancer and neurodegenerative conditions, and I'm optimistic about how that might yield significantly improved treatments. With Alzheimer's, say, we were stuck for a long time on variants of plaque hypotheses (“this bad stuff accumulates and we have to stop it accumulating”)... it's now getting hard to ignore the fact that the immune system clearly plays a major -- and maybe dominant -- role. Elsewhere, we're plausibly on the cusp of effective dengue, AIDS, and malaria vaccines. That's pretty huge.
Last, energy technology. Batteries (88% cost decline in a decade) and renewables are well-told stories and the second-order effects will be important. (As we banish the internal combustion engine, for example, we'll reap a significant dividend as a result of the reduction in air pollution.) Electric aircraft will probably happen, at least for shorter distances. Solar electricity is asymptoting to near-free, which in turn unlocks other interesting possibilities. (Could we synthesize hydrocarbons via solar powered atmospheric CO2 concentration -- that is, make oil out of air -- and thereby render remaining fossil fuel use-cases carbon neutral?) There are a lot of good ideas for making nuclear energy safer and cheaper. France today gets three quarters of its electricity from nuclear power... getting other countries to follow suit would be transformatively helpful in averting climate change.
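(As a back-of-the-envelope aside: a decade-long cost decline like the battery figure above can be converted into an implied constant annual rate. A minimal sketch in Python, using only the 88%-over-ten-years number from the text; the function name is my own.)

```python
def annual_decline(total_decline: float, years: int) -> float:
    """Convert a total fractional cost decline over `years` into the
    implied constant annual decline rate (compound, not simple)."""
    retained = 1.0 - total_decline          # fraction of cost remaining
    annual_retained = retained ** (1.0 / years)
    return 1.0 - annual_retained

# An 88% decline over 10 years implies roughly a 19% decline every year.
rate = annual_decline(0.88, 10)
print(f"{rate:.1%} per year")  # -> 19.1% per year
```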
There's lots more! New semiconductor technology. Improved ML and everything that that enables. Starlink -- cheap and fast internet everywhere! Earth-to-earth travel via space plus flying cars. The idea of urbanism that doesn't suck seems to be gaining traction. There's a lot of good stuff on the horizon.
N.S.: I like that we're basically optimistic about the same things! Or at least the same technology things; I'd wager I'm probably more optimistic than you about the improving quality of anime.
OK, so about tech progress. You've been heavily involved in the nascent Progress Studies effort, a sort of integrated push to understand progress and figure out how to accelerate it. What are some of the biggest insights that effort has gleaned so far?
On a related note, I've been debating some people who think our technological stagnation is basically inescapable at this point. What do you think about this thesis? Will advancing to a new cheaper source of energy for the first time in a century boost productivity growth to a degree that digital "bits" technologies have not really succeeded in doing? Will we finally be able to get back on the Henry Adams Curve, and will it make a big difference if we do?
Well, Tyler and I published that piece in the summer of 2019, so I think that Progress Studies -- whatever it turns out to be -- is in its larval stage. Most of all, I think the article yielded a term for a set of topics that plenty of others had been pursuing (often in great depth and for a long time) but that couldn't be easily linked under any broad pre-existing conceptual category. ("Innovation economics" probably came closest but is still too specific.) Some early highlights for me include Jose Luis Ricon's survey of the evidence about science funding mechanisms, Matt Clancy's class (along with his excellent Substack -- you should subscribe), and the Works in Progress magazine, which gives a home for writing on these topics more broadly. Much of our intent was to highlight the value of directly addressing problems that are relevant to progress. So, if someone somewhere is doing important and relevant work as a result of reading our article, I'm very happy about that, even if "Progress Studies" is never mentioned.
I emphatically reject the suggestion that technological stagnation is inescapable... with one caveat. The caveat is that "stagnation" may be a slightly too loaded term at this point because it tends to refer to particular measurements pertaining to the 20th century experience in the US -- Robert Gordon and all that. Gordon himself says at the end of Rise and Fall that we won't see twentieth century-level rates of improvement in the future because of demographic headwinds (among other things). So, to the extent that you're talking about particular macroeconomic time series, you suddenly scope in a whole bunch of definitional and deflator and measurement debates and all that. But if we ignore those and focus on the basic phenomena that we really care about: progress in science, advancement in technology, and the effective deployment of both such that broader societal welfare is enhanced -- yes, I would say that I'm certain that we can do very meaningfully better than we are doing today. I'll claim that we could double our rate of progress.
N.S.: Right. Aging, plateauing educational attainment, etc. will weigh on productivity as a whole, which is why I predict only a very modest TFP acceleration this decade even though I'm optimistic about tech itself. But OK, if our rate of tech-driven economic progress can double, that raises the question of why it's been as slow as it has. Were we inventing tech that enriched our lives without showing up in GDP? Were we busy retooling from one sort of innovation to another? Or did we just not happen to hit one of those rich veins of technological ore, as it were?
And also: What do you think are the most important steps for government to take to speed progress along? Will reforming science funding, perhaps along the lines Ricon lays out, make a big difference? I've been calling for a big increase in federal science funding (the Endless Frontier Act being advanced by Ro Khanna and Chuck Schumer). How much do you think we can expect from throwing more money at the problem? After all, federal spending on R&D is about half of what it was in the 70s and 80s, as a percent of GDP.
(Subscribed to Clancy's Substack, by the way. Thanks!)
"Why is growth slowing?" is a pretty big question, and anything I say will mostly just be skimming the approximate contours of explanations that lots of others have sketched out in much more detail. But I'll give a few thoughts.
As a prefatory point, "why has progress been slow?" might be approaching things backwards -- maybe it's better to puzzle over "why is it ever fast?" or “why does it exist at all?”. The vast majority of human societies generate very little meaningful frontier progress most of the time! (As compared to Solow-style catch-up growth.) Lots of people have sketched out reasons as to why the midcentury picture in the US was perhaps anomalously good and I think those stories probably all have truth to them. If I had to offer some hypotheses that tend to get a bit less attention, I'd throw out a few:
(1) What's going on in science? What we call "science" has changed immensely since WWII. For one thing, it’s way bigger. In 1950, the federal government gave about $100 million per year (in 2021 dollars) to universities for the purpose of basic research. That’s now more than $20B per year. If we look at R&D across the economy as a whole (though that's harder to measure reliably), nominal spending has supposedly grown by a similar magnitude. If we look at numbers of researchers, same story: about 10x more research doctorates are awarded every year today than were in 1950. And how we do it has changed. For example, peer review in the modern -- legitimacy-conferring -- sense of the term is a postwar invention and arose more as a result of funding bureaucracies and controversies than any scientific exigency. (Einstein was very offended in 1936 when the editor of Physical Review sent his paper with Rosen on gravitational waves out for an external opinion.) Lots of people have documented (and decried) the increasingly gerontocratic aspects of modern science and the growing age of grant recipients. However you cut it, what we're talking about when we say "science" just isn't close to the thing it was seventy years ago.
A priori, I think it's completely plausible that many of the changes we've made have not been for the best. Most systems get worse in at least certain ways as they scale. The idea that science could have gotten worse in significant ways sometimes sounds strange to people -- like, we’re doing so much more, how could that be bad? -- but I think that misses the many examples of sensitivity of scientific processes to institutions and culture. Swiss nationals have won more than ten times more science Nobels per capita than Italians have. Ten times! And yet they're neighbors, and Italy certainly isn't lacking in scientific tradition -- Fermi, Galileo, the oldest university in Europe, etc. The "how" of science just really matters. At the micro level, when you look at the careers of individual scientists, the same thing is often striking -- if Born hadn't recommended that Rockefeller give Delbrück a fellowship, he might well have dropped out of science... his track record up to that point wasn't very impressive. But if he had, how far would that have set molecular biology back? So, given that science has changed a lot, we should push ourselves to really understand the effects of those changes, and we shouldn’t assume too casually that they’re all good.
(2) Culture. As Ezra Klein recently described in the New York Times, and Marc Dunkelman has written about in his great piece about Penn Station, a particular version of distorted, hypertrophic progressivism that took hold in the 1970s may have had (and still be having!) quite significantly stifling effects. We perhaps shifted from placing emphasis on our collective effectiveness in advancing prosperity and opportunity for people to the perceived fairness that was embodied in whichever particular steps we happened to take. Or, to say that another way, we shifted our focus from sins of omission to sins of commission. Take California, as Ezra does: there is almost endless attention paid to making sure that no single state project has even a tint of impropriety or suboptimality. The result of that cultural shift, however, is that the state as a whole is then often beset with awful results. With this ethos and panoply of strictures, it turns out that California is almost functionally incapable of constructing a high speed rail line connecting its two major metro regions. California has less civilizational capacity than the France of the 70s that built the TGV! (I spoke with Jerry Brown about this a few years ago and he commented that the change over the course of his lifetime on this cultural front was very striking.) California shifted mid century from being the US's fastest-growing state -- 50% population growth between 1950 and 1960 -- to a state that is somehow, improbably, shrinking. This is, obviously, mostly because of the regulations the state's inhabitants put in place that block the housing that's required to support California’s economic success. As a result, California has lost the "technology" of being able to affordably house its inhabitants. In these ways and many others, technology is both advancing rapidly and yet often receding in the state. 
(Tenet is a movie about time moving backwards and forwards simultaneously… as a result of its policies, California is the Tenet of states.)
And it's not just California. It's hard to look at Germany, where 77% of inhabitants say that their government has done a good job responding to the pandemic, take stock of the COVID vaccination rate (under 5%, compared to more than 22% in the UK and 73% in Israel), and to not raise an eyebrow. So: what's going on in our culture -- do we just not want good things?
(3) Institutions and first mover disadvantage. Mancur Olson was, I think, right in his focus on institutional dynamics and how principal/agent issues and collective action problems seep into our systems over time. For example, the FAA was probably pretty good in its early days... to enable a successful aviation sector, there are lots of public goods to be provided in the domain (clear rules around airspace, air traffic control, navigational aids, airports, ...) and very few would argue that the optimal amount of aviation regulation is none. On the whole, the US did a great job of fostering aviation technology, and the people who went to the FAA in the early days were motivated by the right things -- a desire to support and enable a burgeoning aviation sector. Today, the FAA is a sprawling, stodgy, entrenched bureaucracy. It has about 45,000 employees represented by multiple unions; there are lots of countervailing constituencies with vested interests to protect; there are endless accumulated outdated rules. For example, electric engines are, as of today, prohibited in certified aircraft! Despite the obvious potential drones have for everything from surveying to firefighting, the FAA still prohibits beyond-line-of-sight use. (In stark contrast to the deliberately permissive approach the FAA used during aviation’s infancy.) The avionics you see in cockpits are bafflingly primitive because it's so hard, slow, and expensive to get the FAA to approve new technology. As a result, pretty much every pilot flies with an iPad running sophisticated flight planning software -- their connection to a world that the FAA doesn't encumber. Now, every system suffers from some of this inertial dynamic, no matter what the culture around it is. No nefarious lobby is opposed to electric engines in aircraft, and the FAA will assuredly legalize electric engines at some point... there's just the status quo bias that naturally ensues from "well, we have a working system; that system naturally resists change".
The period of the early twentieth century was an era of building in the broadest sense, from universities to government agencies to cities to highways. The byproduct of this period of building is maintenance and we haven't figured out how to meta-maintain -- that is, how to avoid emergent sclerosis in the stuff we build. I see the exact same thing in financial services, by the way. Nobody in financial services thinks that real-time settlement is a bad idea. Cryptocurrencies show that it is a quite tractable problem. The "enemy", such as it is, is the calcification that follows from an existing install base. And all cultural questions aside, the US simply has a very large existing install base of aged institutions and systems.
(4) Talent allocation. Maybe there's something about a substitution effect in where smart people go... perhaps certain sectors are hiring so many of the best people that other sectors have suffered. Most readers will probably have seen versions of the famous innovator immigration chart. I'd be interested in sectoral versions of this. There's a 1991 paper from Kevin Murphy about some versions of this question, which includes this eye-catching table:
The paper's conclusion: “Our evidence shows that countries with a higher proportion of engineering college majors grow faster”. (Obviously the direction of the causality is still a question.)
More broadly, lots of papers at this point show how across-firm productivity dispersion is increasing... it's possible that something is happening such that (a) certain kinds of firms (“frontier firms” in the paper just linked) are now disproportionately able to afford to hire the best people, or (b) dynamics around firms have changed such that it's much harder for most firms to be as productive. Anyway, this is a big topic in its own right, so I won't say too much here except to flag that "allocation of talent across sectors" is a big question and seems plausibly the source of a significant effect. Maybe people are just working on the wrong things. I wish there were more analyses here.
(5) Our problems are wickeder. While we have to be careful to not over-diagnose explanations involving low-hanging fruit (since they can easily be excuses), I think it is clearly the case that the major open problems in many domains involve emergent phenomena and complex/unstable systems that often have lots of complex couplings and nonlinear effects and so on. In biology, cancer, autoimmune conditions, anything involving the microbiome... these are all just intrinsically harder problems than individual infectious diseases. In computing, modern machine learning is much more about experimentally figuring out what works in an emergent sense than, say, operating system or network protocol design, which are more about top-down architecture. (Those latter domains have their own emergent phenomena, too, but they're more the exception than the rule.) You could probably extend the argument to materials science or condensed matter physics... these aren't as neatly characterizable in closed forms as, say, basic mechanics or thermodynamics. It's hard to say whether the proportion of important problems that are "wicked" has increased but I think it's plausible that it has.
This answer is already too long, so I'll stop here.
N.S.: So if I'm reading you right, you think modern culture doesn't value technological progress enough. Why do you think this culture changed? Did we simply get enough stuff that getting yet more became less urgent for us, and we instead started to care more about zero-sum fights over social status? That's what happens when you climb up the ladder of Maslow's Hierarchy, right?
Maybe we just aren't envisioning the cool new stuff we could get? That brings me to another question: What are the new technologies that humankind needs the most right now? Many of us have been saying that cheap (green) energy is the biggest, and of course everyone is excited about vaccines right now. Would you agree? And what others would top your list? What marvels that we could create today -- or are already creating today -- do we not even realize we need?
And that brings me to yet another question, of a more personal nature: What got you interested in creating technology in the first place?
(1) To whatever extent cultural change is one of the root causes... I don't really know why it changed or how to best characterize that. I'd probably first try to better understand what happened between 1945 and 1970 in the US, since that seems to have been a locus, but maybe that's a parochial perspective. (Even though I'm Irish.) And I'll reiterate the point that most cultures through time haven't been, I think, especially conducive to progress or innovation, so I'd also invert -- maybe we should just identify those that were and try to figure out what it was that was special about them. As a meta point, cultural questions are tough for anyone to write about since even true diagnoses will never be wholly convincing (we have no counterfactuals) and because culture is a topic that people are often quite sensitive about. I thought that this was a pretty neat paper but lots of estimable onlookers strongly disagreed. As a result of these inhibitory forces, I suspect that good scholarship here is underprovisioned.
I think that Maslow's Hierarchy-style explanations are decently plausible... it's arguably the case that enough people in the US had climbed high enough on Maslow's Hierarchy by, say, 1970 that other considerations became focal. It could also be true that time horizons shortened, maybe as a result of many needs being satisfiable quickly rather than slowly. Even if that’s locally good in many ways, being willing to operate on longer timescales is very important. It’s part of why I’m so drawn to the Long Now (on whose board I serve) and it’s the single biggest thing I’ve taken away from Derek Parfit’s work. As Jonas Salk said, it’s important to think about whether we’re being good ancestors!
(2) In terms of what the world needs, improvements in medical technology are probably still #1. Climate change mitigation technology (cleaner energy generation and CO2 sequestration and so on) is also quite high up. More broadly, we need to make all of the things that you and I enjoy every day cheap and efficient enough for billions more people to afford (with safety/security high on that list). But "need" is a tough framing. There's obviously so much stuff that would be fabulously valuable and it's hard to predict the magnitude of the impact upfront. Besides the obvious diseases, better cures for depression and mental illness and other psychiatric conditions would be hugely beneficial. $100 robotic surgeries. A machine for cheaply manufacturing arbitrary food -- a 3D printer for nourishment into which you just insert elemental "ink cartridges". (And not just for replicating already-existing foods -- the possible design space is very large!) Flying cars, obviously. (Plus space-based earth-to-earth transportation.) Fast-growing trees so that everywhere can be as blissfully arboreal as you like. Cities that look like this. Technology for comprehensively eliminating air pollution (not just from internal combustion engines but also sand, dust, etc). Ubiquitous detectors for toxins like lead, arsenic, and benzene. Smart books that are better fit for purpose. A babelfish that works. Programming environments that are less hopelessly primitive than those today. (Take Mathematica/Squeak/Genera and go far beyond them.) Better education technology for everyone... what's Khan Academy but 10 times better? Too-cheap-to-meter water desalination. Batteries with so much energy density that they need never be recharged. Nanotechnology -- self-repairing wood; flexible glass; translucent steel. Quantum computers that accurately simulate physical chemistry. Completely new kinds of matter. Better catalysts for all major existing chemical processes.
(3) Hmm... Isaac Asimov's New Guide to Science. I read that when I was 13 or 14 and thought it was just amazing. (I was an exchange student in Germany at the time. I didn’t learn much German but I did have my eyes opened to many aspects of science that I previously knew nothing about!) Some of John Gribbin's books, like In Search of Schrödinger's Cat, really inspired me. Douglas Hofstadter -- especially Metamagical Themas. (I read GEB when I was a teenager but found it a bit of a slog.) But, honestly, I think I was always interested in creating technology to some extent. I spent hours and hours playing with Lego when I was young and then transitioned pretty quickly to programming. I remember being pretty certain that I'd love programming before I'd ever written a line of code and, sure enough, I did. So, maybe it's just something about how my mind is wired.
N.S.: On the societal motivation front, what do you think of the idea -- which is super scary to me -- that war is a big driver? The Civil War gave us the railroads and accelerated manufacturing technology. World War 2 and the Cold War gave us modern electronics, vast leaps in computer science, medicinal improvements, the interstate highway system, materials science... tons of stuff, in addition to institutions like what ultimately became the NIH. Is there any chance that U.S.-China competition will drive tech forward in a similar way? And what do you think of the Endless Frontier Act?
And another question, which might be a bit of a confrontational question because you are the head of a big superstar tech company that hires lots of superstar employees. There's been this concern among some economists that leading companies are hogging all the top talent and all the intellectual property too, and that by doing so they're stopping ideas and know-how from diffusing back down to all the second-tier companies, and that this is stymying productivity growth. This possibility was raised by some people at the OECD a few years ago. Does this worry you? If all the top people in software, for example, just shuffle back and forth between Stripe and Google and a few other companies, does that mean there are fewer to bring know-how to the rest of the companies out there, or to start their own startups, etc.?
War -- I think that demand pull is real (I've tweeted a few times recently about how Schmookler's view of the world is underrated compared to Schumpeter's) and it's clear that war can certainly accelerate some technologies. The atom bomb is the obvious example but there are tons of examples from WWII -- scaled-up penicillin manufacturing or plasmapheresis, for example. They were pretty significant! Vannevar Bush suggested that medical advances made during the war may have saved more lives than were lost during it. That said, I'm ultimately very doubtful that war is necessary or anything close to the best way to achieve scientific breakthroughs. (See Field's book here too.) And even if World War 2 (for example) was unusually effective at fostering scientific invention, the relevant reference class is presumably something between "all wars" and "all wars likely to involve developed countries today" -- with respect to those, I think the EV in either research productivity or (more importantly) broader welfare looks very negative.
Endless Frontier Act -- well, I'm glad that we're thinking about these problems! Overall, my single biggest science policy suggestion would be to pursue far greater structural diversity in our mechanisms. More different kinds of grant making institutions, more different kinds of research organizations, more different career paths for participants, etc. That's not easy to do -- bureaucracies by their nature seek to standardize, and standardization fosters homogeneity. So, to the extent that the Endless Frontier Act can bring us closer to a more structurally varied world, I'm probably supportive relative to the status quo. My biggest qualm would probably be that it combines regional development policy with scientific policy. While the political merit is easy to see, I'm not sure that that's a good idea. Talent clusters are real and I think it probably makes more sense to think about how best to improve those clusters than it does to foster underdog competitors.
Talent concentration -- yeah, per my earlier answer, I think talent allocation is a pretty interesting area to dig into. When I started college in 2006 I remember reading that something like 40% of the previous graduating year in my school went into finance, even though technology would pretty clearly (even then) have been a more sensible choice on many levels. Like other systems involving collective investment decisions, it's clear that the sociology can cause persistent misallocations or bubbles. I think it's interesting to think through how the hysteresis, group incentives, economic factors, generational shifts, status dynamics, institutional forces, etc., etc., may all play off each other. Overall, though, I worry less about any sector gobbling up all the smart people -- if there really are great opportunities elsewhere, some smart people will see that and things will eventually correct (as happened with finance and tech) -- and worry more about some sectors being structurally inhospitable to very talented people. I don't think that the ambitious upstarts who go into high-speed rail (in America, anyway) are going to have a great time or to have much success in convincing their friends to follow them. And I suspect that, for various reasons, too many domains look somewhat like high-speed rail -- what would a contemporary William Rainey Harper's experience be? There's a view that the internet is a frontier-of-last-resort and I don't think it's totally wrong.
N.S.: To expand on the idea of government's relationship to the private sector -- especially highly innovative businesses -- what's your general view of the proper relationship? When I was young, lots of Silicon Valley people held a basically techno-libertarian view that the government should stay out of tech as much as possible. In recent years it seems like the winds have shifted; some people, including your former employee Saikat Chakrabarti (whom I interviewed the other day), are proposing a much more vigorous "development state", that gets involved in all kinds of industrial policies. Then you have thinkers like Mariana Mazzucato, Simon Johnson, Jonathan Gruber, etc. who believe that government support of purposeful innovation projects (rather than just basic research) is upstream from corporate innovation. Do you have a general philosophy of how government should intervene and how it should not intervene in order to create a more robust private tech sector?
And along these lines, what do you make of the current technological competition between the U.S. and China? Real, or hyped? Any thoughts on what our government should be doing to strengthen our hand? If you're not allowed to talk about this, I completely understand!
I'm a big fan of "government as buyer". I was reading this report a few days ago that covers the NASA COTS program that SpaceX benefited from and makes the (sensible, I think) case that something like this could work well for nuclear energy. Similarly, while the US government's response to the pandemic was overall very poor, Operation Warp Speed specifically was an excellent idea and has worked out great -- the Novavax vaccine wouldn't exist without it. And there are lots of antecedents here too if you go back further... NASA purchased a majority of all integrated circuits manufactured in the early/mid 60s. Over the decade, they fell in cost from ~$1,000 to $1 apiece, largely because of production increases and manufacturing learning curve progress stimulated by the government.
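(The integrated-circuit story reads like a classic experience curve -- Wright's law, under which cost falls by a fixed fraction with each doubling of cumulative production. A rough Python sketch of the arithmetic, where the twenty doublings of cumulative output over the decade is my own hypothetical assumption, not a figure from the text:)

```python
def per_doubling_retention(cost_ratio: float, doublings: float) -> float:
    """Given the overall cost ratio (final cost / initial cost) and the
    number of doublings of cumulative production, return the fraction of
    cost retained per doubling under Wright's law."""
    return cost_ratio ** (1.0 / doublings)

# Cost fell from ~$1,000 to ~$1 per chip: a 1000x decline overall.
# Hypothetically assume cumulative production doubled 20 times.
retention = per_doubling_retention(1 / 1000, 20)
print(f"~{1 - retention:.0%} cost decline per doubling")  # -> ~29% cost decline per doubling
```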
I also think that certain kinds of R&D are public goods and that they'll very likely be underprovided without deliberate mechanisms to address that (such as public funding of R&D), and I'm conceptually strongly supportive of those mechanisms. High-quality R&D work is among the most important things for our society to effectively support. As an aside, this view of the role of government is an area where I find myself often disagreeing with libertarian-inclined individuals. While I consider myself strongly pro-free market and pro-freedom, I am not a libertarian, and I think this is the kind of place where a traditionally libertarian approach simply doesn’t have much that’s useful to say.
Now, all of this positivity around the role of government notwithstanding, I think that Mazzucato overstates her case pretty substantially. Jose Luis Ricón wrote a very comprehensive and I think correct critique.
On the Johnson/Gruber work, I think that we should welcome informed and ambitious attempts to tackle tough questions, as they do in Jump-Starting America. It's a lot easier and safer to say something narrowly correct on a less important issue. So, kudos to them! That said, I have two main issues with work like this. (There's lots like it; I don't mean to single them out.)
First, my view is that it's just not that meaningful to talk about "science funding" as a lump sum or to talk (as they do) about the aggregate returns on federal research dollars. Which dollars and allocated how? HHMI spends a lot less than the NIH does (about 98% less) and yet its awardees win a lot more prizes per dollar. You could protest that HHMI cherrypicks talented people -- and they certainly do -- but that's the point: are we talking about the ROI of funding the best people or something else? In innovation, imputed average elasticities are going to be extremely misleading unless you're very careful. (Would you get twice as many Ronaldos and Messis if you doubled soccer funding? Or, to be parochial, the returns to tech investing may be incredible or terrible depending on who's doing the allocating, and the outcomes on the current margin would depend a great deal on the particulars.) So, I think that the industrial policy folks too often talk about "the returns to publicly-funded R&D" as a monolithic whole. I would ask them: exactly how will you choose who gets funded? What will the relevant incentive structure for those people be? What's your theory for how they'll do top-tier science/research/innovation/etc.? These are tough questions! Building systems that allocate capital well at scale and through time is hard. If they have good answers, I'm probably supportive... more experimentation would be great. If not, I'm less hopeful.
Second, I think that Johnson/Gruber also make their job too hard by again combining regional development policy with science policy. Regional development policy is an important problem, don't get me wrong! I'm sympathetic to List-ian views and the idea that the state should be a fairly active participant in the process, and, as a European citizen, I appreciate that the EU engages actively with these questions. But Ireland, say, didn't need to invent the transistor or the endoscope in order to benefit immensely from their existence. As a society, our first and overriding priority should be to generate the innovation at all, anywhere. If I were setting US policy, I would argue for treating these (how best to boost science funding and how to foster regional development) as quite distinct questions. If we somehow get really good at both, we can always go for bonus points in the future by tackling the two issues in conjoined fashion.
To your last question, the broad competition between the US and China is pretty real in the sense that many in both countries' ruling apparatuses view it as such. And it's certainly the case that the Chinese Communist Party sees Western values, liberty, and liberal democracy as competitive (and threatening). In tech in particular I think it’s important to recognize that Chinese companies and the Chinese government have somewhat different outlooks and goals, coerced oaths of fealty notwithstanding. (For example, Zhang Yiming, the ByteDance founder, makes no secret of his affinity for Hayek -- the latter not exactly one of the leading proponents of Party-aligned thought!) Within the technology ecosystem, I think that most companies simply want to work together effectively however it makes sense and aren’t particularly ideologically motivated. (I myself admire many Chinese CEOs and companies a great deal.) The challenge for everyone both in and outside of China is the occasional and unpredictable intervention or retribution from the Communist Party. Because it’s mostly driven by politics from above, though, I don’t think that “tech competition” is ultimately the right framing... I think there’s government-driven competition that sometimes shows up in tech and in other sectors whether they like it or not.
I'm out of my depth commenting on what the US government should do. I'll just say that I find it somewhat discouraging that one of the biggest and clearest violations of liberty and human rights of our era -- the treatment of the Uyghurs by the Chinese government -- receives as little condemnation from those in power as it does. I get why, of course: it's bad for business. But I think that our support for human rights and liberty should be somewhat closer to an absolute. (On this topic, 70% of Americans agree.) In his commencement address at Harvard in 1978, Aleksandr Solzhenitsyn claimed that civic courage was in decline. I don’t know whether or not it was (or is), but, whatever the trendline, it seems important to me to ensure that it doesn’t decline further.
N.S.: My own problem with the idea of the "entrepreneurial state" is that I think it frames government as a competitor to private entrepreneurship, when in fact it's usually just upstream. Almost any technological innovation can be traced back to some government-funded project, at least these days. We can ask whether that means government is necessary for the upstream science or whether it just offers its dollars unnecessarily. But either way, it seems like governments and companies just do different types of innovation (though the big corporate labs of the mid 20th century blurred this distinction). Government does more scientific research, companies do more applied product innovation. Which of course makes sense in terms of economic theory, since you can sell products, but scientific discoveries are hard to monetize and quickly leak across institutional and national borders. Ricón seems to recognize this innovation supply chain, drawing a distinction between "the research side of the technologies (Scientific support)" and "the conversion of research into saleable artefacts". Do you think that this supply chain is a good way to think about government and industry with regards to innovation?
And I know I should let you go and this interview is getting pretty long, so one last question: Other than expanding Stripe into a world-conquering technological behemoth, obviously, what's on your plate? Are there philanthropic causes, policy efforts, or lines of intellectual inquiry that you'll be pursuing in the near future?
I think the supply chain view is very directionally correct though there are obviously lots of examples of companies undertaking quite fundamental research and universities producing work that's essentially immediately commercializable (like Genentech or Google). The jet engine and transistor, two of the great inventions of the twentieth century, were developed outside of academia, for example. Toyota and Panasonic have been doing a lot of work on solid-state batteries, which would be a very big deal if realized, and Toyota claims that they'll have a running prototype this year. (As a random tangent, I find it very inspiring that John Goodenough, one of the inventors of the lithium-ion battery, is 98 and also currently working on solid-state battery research, with publications and patents actively flowing. Goals!) Anyway, I believe in messy innovation clusters and the supply chain model might suggest more separateness and unidirectional flow than is actually optimal (or manifested in practice).
As for me... well, Stripe keeps me plenty occupied. Billions of people around the world are still unbanked or underbanked! Pretty much every aspect of internet commerce is still shockingly primitive. I read this paper this morning about some of the effects of fast internet on employment in Africa... "fast internet" is at this point a pretty mundane phenomenon for many of us and yet it turns out that its provision may be one of the very best developmental interventions. So, I remain a very big believer in infrastructure, and I think that Stripe is still early in its journey to unlock all sorts of entrepreneurship and economic activity that wouldn't otherwise have occurred. Almost every week we ship an improvement that makes hundreds of thousands or millions of businesses better off -- and our ability to make such improvements is growing and not shrinking with time. How cool is that?! More broadly, John and I are thinking a lot about whether there are things we can do to help grow the tech sector in Europe, and we recently helped launch a new program that we co-developed with the University of Limerick. Outside of Stripe, my partner works as a biomedical scientist, and we have a few ideas in that space. Maybe for a future interview.