50 Comments

An excellent write-up. I particularly appreciate that the authors highlight the revolutions in data analysis and -omics technologies, which are arguably far more important to fundamental progress in biological engineering than CRISPR and other editing technologies are. With that said, I would offer one small criticism (apologies for the long incoming response).

While I broadly agree with the authors' and your own optimism regarding the pace and potential of progress within biotech, I do worry that the complexity of biological systems, and the resulting difficulty of consistently and accurately engineering them, isn't given its proper weight. The model of gene -> mRNA -> protein -> trait is a simplified one that overlooks several other mechanisms and systems of control at each step in the process. It's not just a matter of making more of a transcript to make more of a protein to produce a trait.

It's been known for some time that changes in the level of mRNA transcripts only correlate with protein levels with an R^2 of about 0.5 (obviously that is an average when looking broadly across different cell types and species). The difference comes about from various processes that determine whether or not a transcript is converted into a protein, like the rate of transcript degradation or the rate of ribosome loading. Additionally, that's only considering the protein-coding genes and ignoring the vast number of RNAs that appear to play regulatory roles, the variety of which is so great that I've honestly lost track of all the acronyms invented to categorize them (ncRNA, lncRNA, piRNA, siRNA, miRNA, etc.).
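Just to make that figure concrete, here is a toy sketch in Python (simulated numbers, not real measurements) of what an R^2 around 0.5 between transcript and protein abundance looks like when post-transcriptional effects are modeled as variation independent of mRNA level:

```python
# Toy simulation (illustrative only, not real data): protein levels track mRNA
# levels, but degradation rates, ribosome loading, etc. add variation that mRNA
# abundance doesn't capture, so mRNA explains only ~half the variance in protein.
import numpy as np

rng = np.random.default_rng(0)
n_genes = 5000

log_mrna = rng.normal(loc=4.0, scale=1.0, size=n_genes)               # log transcript abundance
post_transcriptional = rng.normal(loc=0.0, scale=1.0, size=n_genes)   # everything mRNA misses
log_protein = log_mrna + post_transcriptional + 2.0                   # log protein abundance

r = np.corrcoef(log_mrna, log_protein)[0, 1]
print(f"Pearson r = {r:.2f}, R^2 = {r**2:.2f}")  # prints an R^2 close to 0.5
```

When the post-transcriptional variance is comparable to the mRNA variance, R^2 lands near 0.5, which is the ballpark figure above.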

Once you actually get the protein produced there are additional levels of control affecting its function. Post-translational modifications, wherein a molecule like phosphate or glucose is attached to the protein in a way that alters its specificity and/or reaction rate, appear to be pretty ubiquitous in the cell and can alter phenotypic traits all on their own. And then of course there's the metabolome, the collection of all the various other metabolites that make up the cell and can alter the rate and direction of metabolic pathways through positive/negative feedback loops. And that's just the stuff we're aware of. I only recently learned of emerging work on a whole 'nother level of control involving tRNAs (the RNA molecules which bring individual amino acids to the ribosome for construction into proteins). Apparently many if not most of the bases of a tRNA can be subject to their own modifications, each of which alters the probability that a given tRNA will be involved in protein translation, thereby (presumably) influencing protein production.

None of the above is to say that I'm not optimistic. I am. But it's worth remembering that our efforts to understand biology at a fundamental level have up until very recently been like attempting to understand the engineering principles of an alien supercomputer written in a language we can't understand. Mostly we've just been breaking things and seeing if it produces any interesting or notable effects, and even with the emerging revolutions our progress with biology will still involve a fair amount of that trial and error, at least until we have a true mechanistic understanding of how all the interacting components of a cell actually lead to an observed phenotype.


My wife is a biotechnology PhD, and I have been following this domain for the last 13 years or so. I can say that the biotechnology hype has been going strong for the last 30 years or so. To me, it's kind of like flying cars, android robots, cold fusion, or manned Mars exploration. We are told all the time that we are at the tipping point, at the edge of a breakthrough. But it never happens. The hype is pushing more and more young people to study the field, which results in an overproduction of graduates with poor career prospects, as is well described here: https://goodscience.substack.com/p/texas-gas-stations-nih-sponsored

My wife's graduate program included extra funding from a government program, created based on the government's prediction of high market demand for biotech graduates. After graduation, it turned out that the market demand for biotech graduates was actually very low. The biotech PhD meme pages suggest that the same thing happens all over the world.

I think the biggest gap in these kinds of analyses driving the hype is that they essentially miss the adoption cycle and assume that all breakthrough technologies are adopted instantly by everyone. This is probably a bias resulting from the quick adoption cycles of software and consumer electronics. My experience is that the typical adoption cycle takes 20-30 years - from the moment a technology first becomes available, to the point where it is widely adopted by businesses and becomes the new standard. For all the kinds of technology I have worked with - mobile phones, office computers, engineering software such as CAD or FEM analysis, modern data analytics - the business adoption cycle took that long, and from what I have seen in biotechnology, the adoption cycle seems similar if not longer.


Excellent article, and well written. It is not easy to combine scientific, technological, and historical information into a tight piece. Well done!

On the content, I agree that all the advances mentioned are real. However, much as we have discovered in all the other AI-connected applications (NLP, autonomy, etc.), AI is a great performance aid but does not fundamentally shift knowledge. As an electronics person, the current state of biological systems reminds me of the early days of electronics. Yes, Moore's law enabled exponential gains in size and speed; what is missed is that design had to keep up with this complexity. The key tool used by design was layers of abstraction, which were supported by software tools (a whole industry called EDA). These abstractions allow companies like TSMC to exist. Digital biology seems on the cusp of discovering some of these abstractions. Interestingly, the existing EDA companies are making investments in the biological software space.

Very interesting article. Thanks.


A lot of great points here, but this is a weird take: “but bad news for the generation of grad students who spent their entire Ph.D. working out the crystal structure of a single protein!” The only reason AlphaFold was possible was the exhaustive effort of these students; without their work there would be nothing to train the model on, so actually it's great news that their work is now even more useful. Also, it can't beat X-ray diffraction accuracy yet, and is only comparable to the much less accurate NMR measurements. Finally, I think this post is too optimistic about automation. There is a huge amount of information, like ion distributions and protein dynamics, that we have no experimental techniques to measure. Automation is not going to help with that. An exciting new technique called neural network potentials will eventually solve this problem, but it's a few years away from being ready.
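For anyone curious, here is a minimal sketch (illustrative only, with random untrained weights; nothing like a production model such as ANI or MACE) of the core idea behind a neural network potential: learn a function from atomic coordinates to energy, then get forces as the negative gradient.

```python
# Minimal sketch of a neural network potential (NNP): a learned map from atomic
# coordinates to a scalar energy, with forces obtained as -dE/dx. Real NNPs use
# symmetry-aware architectures and are trained on quantum-chemistry data; this
# toy uses random, untrained weights just to show the moving parts.
import numpy as np

def descriptor(coords):
    """Crude permutation/rotation/translation-invariant descriptor: sorted pairwise distances."""
    n = len(coords)
    d = [np.linalg.norm(coords[i] - coords[j]) for i in range(n) for j in range(i + 1, n)]
    return np.sort(np.array(d))

def energy(coords, w1, b1, w2, b2):
    """Tiny MLP mapping the descriptor to a potential energy."""
    h = np.tanh(descriptor(coords) @ w1 + b1)
    return float(h @ w2 + b2)

def forces(coords, params, eps=1e-4):
    """Forces = -dE/dx, computed by central finite differences for simplicity."""
    f = np.zeros_like(coords)
    for i in range(coords.shape[0]):
        for k in range(3):
            plus, minus = coords.copy(), coords.copy()
            plus[i, k] += eps
            minus[i, k] -= eps
            f[i, k] = -(energy(plus, *params) - energy(minus, *params)) / (2 * eps)
    return f

# Example: 4 atoms -> 6 pairwise distances feeding a 6-16-1 network.
rng = np.random.default_rng(1)
params = (rng.normal(size=(6, 16)), rng.normal(size=16), rng.normal(size=16), 0.0)
coords = rng.normal(size=(4, 3))
print("E =", energy(coords, *params))
print("F =", forces(coords, params))
```

The whole point of a real NNP is training such a function to reproduce quantum-mechanical energies and forces, so simulations can run at near-quantum accuracy for a fraction of the cost.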


I think this is true, but you may be underweighting the risk of making bio-engineering so cheap that some lunatic can cook up a bio-weapon in their garage. Right now a disaffected twenty-something with more intelligence than wisdom can engage in various acts of hacking. What if somebody as smart as the Mirai kids (https://www.wired.com/story/mirai-untold-story-three-young-hackers-web-killing-monster/) and as angry as one of the numerous school shooters we've seen over the years turned their hand to making the next Spanish Flu? Something as lethal as the original SARS and as contagious as the modern incarnation of COVID (or worse, measles). This isn't even all _that_ speculative; we already have people _doing_ "gain of function" research, with the goal of understanding these mechanisms.

There's a ton of upside here, but also a lot of risk if we don't regulate and track the reagents and equipment involved.


Hubris. Dangerous overconfidence. A complacent and arrogant seizure of powers to which one has no right. The ancient Greeks knew it well, their myths chronicling again and again the disasters attendant upon the transgression of proper human limits.

Phaeton, who thought he could control the Sun’s chariot and wound up scorching the earth until Zeus had to take him out. Icarus, who flew too close to the Sun, and did his deep dive back to earth. Salmoneus, who got his ass handed to him by Zeus for pretending to BE Zeus. Etc.

What astonishes me is the contrast between the techno-optimist’s incredible intellectual power and the glaring lack of this very basic, philosophical self-reflection. This is the line between knowledge and wisdom. We have erased it so wantonly and thoroughly that the men in the lab coats can, in response to any cautionary words, only look at you, ironically enough, with the blinking incomprehension of the religious fanatic.

For this is what “techno-optimism” is, despite its unassuming moniker (what could be more unassuming than being “optimistic” about our wonderful new “technologies”?). So tell me, ye techno-optimists: what ARE your limits? Who among your sect sets them? What basic philosophical principles do you use to formulate those limits, if indeed you have any? If not, what are the first principles you rely upon to justify that lack of limits?

Could you please for a moment, for us mere mortals, train your laser intellects on the philosophical and moral foundations of your faith? We who live in the sandbox in which you play God deserve to know.


While these advancements are tempting, Michael Crichton had a valid point in regard to humans playing God with nature. We are never quite as clever as we think.


Excellent. Thank you for publishing this, Noahpinion, and to Joshua and Kasia for spelling this out.

This is a very good overview for a general audience. I appreciated the mention of the creation of a Foundational Model of human biology for better outcomes utilizing Artificial Intelligence.

I appreciated this bit of Negentrope's comment: "...it's worth remembering that our efforts to understand biology at a fundamental level have up until very recently been like attempting to understand the engineering principles of an alien supercomputer written in a language we can't understand. Mostly we've just been breaking things and seeing if it produces any interesting or notable effects..."


This is a great reference point for what a human labor workflow looks like in a highly AI-integrated industry. The human adds new processes and data streams. The human curates the output. The human integrates multiple lines of evidence. The human handles the clinical trials. The human decides where the profit opportunity lies. But the industry is not possible without the power of massive AI scaling for incredibly complex problems. Viewed another way, the human's job is to feed the AI model what it needs to do its job effectively.

Jensen Huang was asked about the prospects for the robotics industry with the advent of powerful AI, and he described a similar model with a similar emerging opportunity. An AI robotics company still has to do the robotics testing, development, and process integration - the AI doesn't obviate the need for that - but AI makes possible a rapid scaling of this complex, multifaceted process.


> Each of us is comprised of trillions of cells

Each of us is composed of trillions of cells. Each of us comprises trillions of cells. Either is acceptable.

"Comprise": to include, contain, encompass, circumscribe.

comprise (v.): early 15c., "to include," from Old French compris, past participle of comprendre "to contain, comprise" (12c.), from Latin comprehendere "to take together, to unite; include; seize; to comprehend, perceive" (to seize or take in the mind), from com "with, together," here probably "completely" (see com-) + prehendere "to catch hold of, seize".


If we are now in biology where we were 50 years ago with software, it will be an... exciting experience. Will we put "buggy code" into living creatures (food crops? extinct animals?) and have them reproduce? Almost certainly, based on our history with fossil fuels, plastics, the internet, etc.


Who would have thought the dry sciences and wet sciences would intersect to create solutions? Companies such as EXAI (Cambridge, England) are using AI/ML to streamline the time it takes to develop drugs for trials. Beam Therapeutics (Cambridge, MA) has developed single-base editing, more advanced than CRISPR. Some diseases are a by-product of a single flawed base pair. Because genes work in concert with other genes, there are dangers inherent in these types of treatments. Why cut a strand of DNA when you can restrict your target to the replacement of a single flawed base pair? This may be less likely to trigger a dangerous reaction in other genetic material.


Sounds great. When will they be able to treat prostate cancer without giving men hot flashes?


This is a nice article. Biotechnology is the power to control life itself, and so many of its applications - in saving billions of baby chicks from being culled, in cell therapies, etc. - go unnoticed. We publish a new essay on biotechnology every week, if you're interested in learning more.


Great article, but I'm surprised you didn't mention competitions in bioengineering like iGEM - they showcase just how powerful and creative the field's potential is. Well worth checking out their website and having a look at the variety of projects that different universities work on (all of the projects are also fully laid out on their own websites, so anyone interested can read into how it all works).


Will biological engineering free us to be the angels we were meant to be, or make us into perfectly subservient robot slaves for capital? https://en.redjustice.net/kapitalets-robotslavar-eller-frigjorda-anglar/
