51 Comments

An excellent write-up. I particularly appreciate that the authors highlight the revolutions in data analysis and -omics technologies, which are arguably far more important to fundamental progress in biological engineering than CRISPR and other editing technologies are. With that said, I would offer one small criticism (apologies for the long incoming response).

While I broadly agree with the authors' and your own optimism regarding the pace and potential of progress within biotech, I do worry that the complexity of biological systems and the resulting difficulty of consistently and accurately engineering them isn't given its proper weight. The model of gene -> mRNA -> protein -> trait is a simplified one that overlooks several other mechanisms and systems of control at each step in the process. It's not just a matter of making more of a transcript to make more of a protein to produce a trait.

It's been known for some time that changes in the level of mRNA transcripts only correlate with protein levels with an R^2 of 0.5 or so (obviously that is an average when looking broadly across different cell types and species). The difference comes about from various processes that determine whether or not a transcript is converted into a protein, like the rate of transcript degradation or the rate of ribosome loading. Additionally, that's only considering the protein-coding genes and ignoring the vast number of RNAs that appear to play regulatory roles, the variety of which is so great that I've honestly lost track of all the acronyms that have been invented to categorize them (ncRNA, lncRNA, piRNA, siRNA, miRNA, etc.).
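To make that concrete, here's a toy simulation (the numbers are purely illustrative, not real measurements) showing how collapsing all those post-transcriptional processes into a single noise term is enough to pull the mRNA-protein R^2 down toward the 0.5 ballpark:

```python
import random


def r_squared(x, y):
    # Squared Pearson correlation between two equal-length sequences
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)


random.seed(0)

# Hypothetical transcript abundances for 500 genes; protein levels track
# mRNA plus heavy Gaussian noise standing in for degradation, ribosome
# loading, and the other layers of control described above.
mrna = [random.uniform(1, 10) for _ in range(500)]
protein = [m + random.gauss(0, 3) for m in mrna]

r2 = r_squared(mrna, protein)
print(round(r2, 2))  # roughly 0.4-0.5 with these invented parameters
```

Obviously real post-transcriptional regulation is not a single Gaussian noise term; the point is just that even one lumped layer of unmodeled control between transcript and protein halves the predictive power of the transcript measurement.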

Once you actually get the protein produced there are additional levels of control affecting its function. Post-translational modifications, wherein a molecule like phosphate or glucose is attached to the protein in a way that alters its specificity and/or reaction rate, appear to be pretty ubiquitous in the cell and can alter phenotypic traits all on their own. And then of course there's the metabolome, the collection of all the various other metabolites that make up the cell and can alter the rate and direction of metabolic pathways through positive/negative feedback loops. And that's just the stuff we're aware of. I only recently learned of emerging work on a whole 'nother level of control involving tRNAs (the RNA molecules which bring individual amino acids to the ribosome for construction into proteins). Apparently many if not most of the bases of tRNA can be subject to their own modifications, each of which alters the probability that a given tRNA will be involved in protein translation and (presumably) thereby influences protein production.

None of the above is to say that I'm not optimistic. I am. But it's worth remembering that our efforts to understand biology at a fundamental level have up until very recently been like attempting to understand the engineering principles of an alien supercomputer written in a language we can't understand. Mostly we've just been breaking things and seeing if it produces any interesting or notable effects and even with the emerging revolutions our progress with biology will still involve a fair amount of that trial and error. At least until we have a true mechanistic understanding of how all the interacting components of a cell actually lead to an observed phenotype.

Coming from this background as well, I recognise that it’s really complex, and there’s the whole dark side of its potential to account for as well. However, I still think we have a great need for more people to feel less intimidated by the challenges in biological engineering. The more attention it gains, the more problem solvers will come on board, and the regulatory bodies in biosecurity are bound to grow. Point is, the positive potential of biological engineering far outweighs its downsides and difficulties. The creativity that it offers in problem solving is absolutely amazing, and students can tap into that creative potential better than anyone. (Check out igem.org)

Certainly I don't mean to discourage people wanting to push the field forward. I actually think this may be the most exciting time to be in biology in quite a while. All these various technologies coming together at the same time is kind of an amazing confluence and puts us on the precipice of finally answering one of the fundamental questions of biology: how exactly does a genotype lead to a phenotype? But the path to get there is going to be long and hard, and we owe it to the public to temper our optimism with a reasonable explanation of the challenges.

Answering that question will definitely be complicated, and I don’t think there will ever be a better way to explain it apart from ‘it’s complicated’. In fact, for biological engineering, that question is the answer. We don’t owe it to the public to say that it will be difficult to get there, but instead that on the path of discovering the complexity of this question, we are discovering answers to other, even more important questions.

Very good points you raise here. One of the biggest issues with LLMs and tensor technology in general is that they give you probabilistic outputs but do not shed any light whatsoever on the actual underlying mechanistic processes. So we may be able to use these systems to come up with new drugs or identify factors that might lead to a better treatment, but I question whether these methods have any significant role to play in our actual understanding of the mechanisms. All of this is to say that this might be great for drug companies, but it may not actually help us understand the human body at a significantly faster rate. Obviously that might not matter, but we just might end up with more drugs that seem to have some form of efficacy while we have absolutely zero understanding of their mechanisms. It is good, but the process of understanding the human body continues, and I see this as a tool and not a complete paradigm shift.

A partial counterpoint -- I don't think LLMs or similar technology can replace the pure scientist in biology, but it seems like finding more things that work would still be informative. Once you know what the effects of a huge number of molecules are, and which proteins / other biomolecules they directly interact with, that's a lot of data about what those biomolecules do in vivo. Or at least what are some of the systems they are involved with. (It may be very hard to reverse-engineer -- we study the brain, for instance, by correlating physical lesions to functional deficits, but I once heard an analogy of a video game console that gets increasingly degraded as you damage parts. Eventually it can no longer play Super Mario Bros, but the last element to fail before that point is not the "Mario circuit," it is a circuit used in many games that happens to be required for some indispensable function in Mario.)

As difficult as biology is, and as limited as traditional methods are, we've managed to figure out a lot with just those old methods. New things will help -- there's positive feedback between pure science and applications, where new "pure" knowledge lets us do applied science more efficiently, and applications give us tools, and sometimes useful insights, for pure science.

I think I agree with your conclusion, though -- this is a useful tool, but not a miraculous shortcut to "solving" biology.

You both make valid points and I think we're all broadly in agreement. These models haven't replaced the need for human scientists and they don't as yet provide any real insight into the mechanistic basis behind a lot of cell biology. That being said, without them I don't see us getting to that point.

All of this is making me think back to the early days of "systems biology" (or at least when I first started becoming aware of it). The dream was that we'd be able to build molecular models of enzyme kinetics and protein interactions using a combination of all the data collected by -omics and powerful models of cellular dynamics (e.g. reaction-rate models or flux balance analysis). Once we had those, they could drive empirical research by elucidating the likely molecular function of individual cell components, which we could then target for knockout or some other kind of intervention. That...hasn't really happened. At least not to the degree that many hoped. Much like how sequencing the human genome was supposed to bring a bonanza of medical cures and then didn't, it turns out biology is just way more complicated than we thought.
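For anyone unfamiliar with flux balance analysis, here is a minimal sketch of the idea on a made-up three-reaction network (the stoichiometry, bounds, and reaction names are invented for illustration; real FBA models involve thousands of reactions and are built with dedicated tools like COBRA):

```python
import numpy as np
from scipy.optimize import linprog

# Invented toy network with one internal metabolite A and three reactions:
#   v0: uptake      (-> A)
#   v1: biomass     (A ->)
#   v2: byproduct   (A ->)
# Rows of S are metabolites, columns are reactions.
S = np.array([[1.0, -1.0, -1.0]])

# FBA assumes steady state: S @ v = 0 (no net accumulation of A)
b_eq = np.zeros(1)

# Flux bounds: uptake capped at 10 units, other fluxes non-negative
bounds = [(0, 10), (0, None), (0, None)]

# Maximize the biomass flux v1; linprog minimizes, so negate the objective
c = np.array([0.0, -1.0, 0.0])

res = linprog(c, A_eq=S, b_eq=b_eq, bounds=bounds)
fluxes = res.x  # optimal flux distribution: all uptake routed to biomass
print(fluxes)
```

The appeal is obvious: a purely linear-algebraic model that predicts which flux distributions a cell "should" use. The catch, as above, is that the cell's actual behavior is shaped by all the regulatory layers the stoichiometric matrix doesn't capture.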

But, and this I think is the most important takeaway, the goal of systems biology is still a worthy goal. More importantly, I still think the basic approach (i.e. massive amounts of data combined with models of cellular processes) is the right one. It just turns out that we need A) WAAAAAAAY more -omics data and WAAAAAAAY better models, and B) still a lot of traditional reductionist-based research on individual components. So I think you're both right. These tools won't solve biology on their own, but they're necessary to get us there in combination with the tools we already have.

Talking about how the complexity of biology can now be “engineered” makes me think of urban planning problems. Jane Jacobs wrote about cities being complex problems, and Chuck Marohn of Strong Towns has taken up the mantle in our current era. Traffic “engineering” is a horrible pseudo-science full of licensed professionals who have used their “expertise” to deceive and/or browbeat the public into building a transportation system that doesn’t work, kills people, and does irreparable harm to our cities and towns, all because they did not respect the complexity of the system they were dealing with.

Completely agree as a PhD molecular biologist who’s been in relevant fields since the late 80s. I’m highly optimistic, and skeptical of drawing tech-to-biology inferences. I haven’t read the book reviewed in this short article, but it sounds like a useful addition: https://www.nature.com/articles/d41586-024-00327-x

I haven't read the book either (first I'm hearing about it). If the write-up is accurate, I worry that it's jumping from one conclusion I agree with (getting from genotype to phenotype is dizzyingly complex and traits don't easily map to individual genes) to one I don't (therefore, cells cannot be analogized to computers). I actually do believe that cells and computers share a number of similar features and, if pressed, I might even go so far as to say that they're properly considered subsets of the same type of phenomena. What separates them IMO is degree (i.e. the far, far greater complexity of the network of pathways/programs making up biological systems, as well as their fundamentally stochastic nature) rather than a fundamental difference in kind.

Sure—a question of degree, and how far to push the analogy. I have a hard time fitting in epigenetic alleles, which I think of as plasticity in the hardware (or firmware), and which can change pretty dramatically (and stochastically) in response to environmental cues. But I admittedly am far from a computer scientist or engineer!

I mean sure, but most of the miraculous medical advances we've made over the years have been basically just based on trying to break things in smart ways. So I have every confidence that breaking things in more informed ways will lead to even more important benefits.

My wife is a biotechnology PhD, and I have been following this domain for the last 13 years or so. I can say that the biotechnology hype has been going strong for the last 30 years or so. To me, it's kind of like flying cars, android robots, cold fusion, or manned Mars exploration. We are told all the time that we are at the tipping point, at the edge of the breakthrough. But it never happens. The hype is pushing more and more young people to study the field, which results in an overproduction of graduates with poor career prospects, as well described here: https://goodscience.substack.com/p/texas-gas-stations-nih-sponsored

My wife's graduate program included extra funding from a government program, created based on the government's prediction of high market demand for biotech graduates. After graduation, it turned out that the market demand for biotech graduates was actually very low. The biotech PhD meme pages suggest that the same thing happens all over the world.

I think the biggest gap in these kinds of analyses driving the hype is that they essentially miss the adoption cycle and assume that all breakthrough technologies are adopted instantly by everyone. This is probably a bias resulting from the quick adoption cycles of software and consumer electronics. My experience shows that the typical adoption cycle takes 20-30 years - from the moment the technology first becomes available to the point it becomes widely adopted by business and becomes the new standard. For all kinds of technology I have worked with - mobile phones, office computers, engineering software such as CAD or FEM analysis, modern data analytics - the business adoption cycle took that long, and from what I have seen in biotechnology, the adoption cycle seems similar if not longer.

I don’t think the point of the article was to predict what the biotech job market is going to be like in the future, but rather to focus on what the field is producing. And it is truly impossible to deny that the types of therapies coming online now are of an entirely different nature than the small molecule agents we were reliant on before. From monoclonal antibody therapy to mRNA vaccines (and mass-produced lipid nanoparticles to deliver them) to CRISPR/gene editing to cellular therapies, we have multiple new classes of therapies that are in various phases of early maturation as technologies. And they’re all far enough along that we’re starting to see combinations of these approaches.

With productivity generating resources like AI and high throughput techniques plus resource constraints like capital investment, it’s hard to predict what the job market will be like. But for the patients who these therapies end up helping, it’s an incredibly exciting time. So much so that I think we need to be doing a better job to prepare the public for it. The future is a time of more mRNA vaccines and gene editing, not less.

Exactly. This is a great time to be going into biotech, but maybe you don't need a Ph.D. to do it. A modern biotech will need some Ph.D.s but also hardware and software folks. If anything, I think biotech is far less hyped than chat-based AI is now or crypto once was. And that's despite delivering real, amazing things like mRNA vaccines and a cure for sickle cell anemia.

I'm on the academic side of one of these leading-edge fields and I think we're getting better at focusing on translating success. There are still big barriers in moving from animal to pre-clinical to early clinical studies though, and it'd be nice to see the regulatory side catch up a bit to the pace of scientific discovery.

The article is in the usual tone of hype about an imminent breakthrough/singularity/tipping point. I just wanted to point out that we have heard this many times before - next gen sequencing, GMO foods, CRISPR, cultured meat - the list goes on. In the end, it never turned out to be a breakthrough, but slow, incremental progress - evolution instead of revolution. I don't see how it could possibly turn out differently this time around.

Biotech is constantly prone to overhype - it's hard and mysterious (Arthur C. Clarke: "Any sufficiently advanced technology is indistinguishable from magic"). At the same time, adoption is doubly hampered by regulation and bioethics (VERY hard to change public opinion here), both somewhat intertwined. This is why GMO foods are constantly struggling worldwide, and the cultured meat business hasn't really taken off anywhere yet. The mRNA vaccines are a notable exception, where regulatory approval and adoption were greatly sped up in light of the Covid crisis, but they still faced major backlash from all types of skeptics.

My main point was that the constant biotech overhype is driving people into making bad investment and career decisions. Again - it's always a (slow) evolution, never a revolution.

Also, it is worth pointing out that the guest authors are founders of a biotech startup - they are not impartial experts, and it is in their very best business interest to drive the biotech hype as much as possible.

Biotech is prone to overhype and also boom or bust cycles. But your point conflates prior cycles which produced very little that made it into clinical practice with the current cycle. Monoclonal antibody therapies and immunotherapy have both matured to the point that they're being used in broadly disparate diseases and also have set off the next wave of investment to fund future developments, especially with combinatorial approaches.

For example, population level genome sequencing allowed for the identification of PCSK9, a previously unknown molecule that plays a key role in cholesterol homeostasis. Instead of having to find small molecule inhibitors, researchers were able to more quickly develop monoclonal antibodies to target it, and decreased the time from discovery to enrolling in clinical trials, leading to two approved drugs so far.

So I really don't understand the pessimism or contrarian take here. Being skeptical is critical, but I find it too often becomes a crutch. A bias towards skepticism is useful, but we have to be able to point out the true breakthroughs.

It’s the classic fallacy where he thinks past events are predictive of a current one, instead of judging it on its merits.

Sure, there were past hype cycles, but it is important to judge our current moment on what is being produced. And it seems that significant progress is being made.

Idk about “notable exception” - I think Covid demonstrated just how much progress is being made, considering that in a year, 4 highly effective vaccines were made and distributed

GMO foods are a proven entity and arguably more economical. Cultured meat is nowhere near any kind of economic competitiveness, and with the pharmaceutical levels of sterility required for its production it doesn't look promising any time soon. So I wouldn't lump cultured meat in with GMOs' Frankenfood image problems; its impracticality, as yet, has made it not ready for prime time.

I totally hear you. I did a PhD on computational biophysics and a postdoc on bioinformatics (with some applications of ML), but the industry is not there (or close to there), so I had to move to software and DS as a career path.

The struggle is real and is there.

* for AI, industry is ahead of academia

* for comp science and SWE, industry keeps pace with AI

* for biotech, industry is much behind AI.

How about a flying-time travelling car that runs on cold fusion?

https://www.youtube.com/watch?v=ptlhgFaB89Y

Exactly as shown in the movie - we would already have had it by 2015 if all the hype about an imminent breakthrough in these technologies had been true.

Excellent article, and well written. It is not easy to combine scientific, technological, and historical information into a tight piece. Well done!

On the content, I agree that all the advances mentioned are real. However, much as we have discovered in all the other AI-connected applications (NLP, autonomy, etc.), AI is a great performance aid but does not fundamentally shift knowledge. As an electronics person, I find the current state of biological systems reminds me of the early days of electronics. Yes, Moore's law enabled exponential gains in size and speed; what is missed is that design had to keep up with this complexity. The key tool used by design was layers of abstraction, which were supported by software tools (a whole industry called EDA). These abstractions allow companies like TSMC to exist. Digital biology seems on the cusp of discovering some of these abstractions. Interestingly, the existing EDA companies are making investments in the biological software space.

Very interesting article, thanks.

A lot of great points here, but this is a weird take: “but bad news for the generation of grad students who spent their entire Ph.D. working out the crystal structure of a single protein!” The only reason AlphaFold was possible was the exhaustive effort of those students; without their work there would be nothing to train the model on, so actually it’s great news that their work is now even more useful. Also, it can’t beat XRD accuracy yet and is only comparable to much less accurate NMR measurements. Finally, I think this post is too optimistic about automation. There is a huge amount of information, like ion distributions and protein dynamics, that we have no experimental techniques to measure. Automation is not going to help with that. An exciting new technique called neural network potentials will eventually solve this problem, but it’s a few years away from being ready.

Feb 18·edited Feb 18

I think this is true, but you may be underweighting the risk of making bio-engineering so cheap that some lunatic can cook up a bio-weapon in their garage. Right now a disaffected twenty-something with more intelligence than wisdom can engage in various acts of hacking. What if somebody as smart as the Mirai kids ( https://www.wired.com/story/mirai-untold-story-three-young-hackers-web-killing-monster/ ) and as angry as one of the numerous school shooters we've seen over the years, turned their hand to making the next Spanish Flu? Something as lethal as the original SARS, and as contagious as the modern incarnation of COVID (or worse, measles). Like this isn't even all _that_ speculative, we have people _doing_ "gain of function" research already, with the goal of understanding these mechanisms.

There's a ton of upside here, but also a lot of risk if we don't regulate and track the reagents and equipment involved.

Hubris. Dangerous overconfidence. A complacent and arrogant seizure of powers to which one has no right. The ancient Greeks knew it well, their myths chronicling again and again the disasters attendant upon the transgression of proper human limits.

Phaeton, who thought he could control the Sun’s chariot and wound up scorching the earth until Zeus had to take him out. Icarus, who flew too close to the Sun, and did his deep dive back to earth. Salmoneus, who got his ass handed to him by Zeus for pretending to BE Zeus. Etc.

What astonishes me is the contrast between the techno-optimist’s incredible intellectual power and the glaring lack of this very basic, philosophical self-reflection. This is the line between knowledge and wisdom. We have erased it so wantonly and thoroughly that the men in the lab coats can, in response to any cautionary words, only look at you, ironically enough, with the blinking incomprehension of the religious fanatic.

For this is what “techno-optimism” is, despite its unassuming moniker (what could be more unassuming than being “optimistic” about our new wonderful “technologies”?). So tell me, ye techno-optimists, what ARE your limits? Who among your sect sets them? What basic philosophical principles do you use to formulate those limits, if indeed you have any? If not, what are the first principles you rely upon to justify that lack of limits?

Could you please for a moment, for us mere mortals, train your laser intellects on the philosophical and moral foundations of your faith? We who live in the sandbox in which you play God deserve to know.

If Zeus didn’t want us to engineer biology, he wouldn’t have given us DNA sequencing.

And therein lies the intellectual stultification and philosophical naivete of those who see nature solely in terms of use values. Zeus gave us DNA sequencing for the same reason he gave us the Grand Canyon—for wonder and contemplation, for poetry and art.

The bioengineer is a TOOL, useful when we need him, but not farsighted or cultivated enough to hand him the keys to the kingdom, though from the size of his salary he may sometimes draw the endearingly childlike conclusion that he is indeed worthy of those keys.

The same could have been said of fire and electricity. I'm surprised you didn't bring up Prometheus while you were pontificating using irrelevant examples from mythology. If you want to convince anyone here rather than just hear yourself type, please make actual arguments instead of just flowery statements.

This is excellent copy-pasta and I will be stealing it.

You are writing this comment on a phone, one of the most technically advanced things ever created: built with chips so tiny that parts of them are measured in atoms, connecting to the internet using WiFi, which packages and sends your comment to a receiver, translates it into electric pulses or light, and connects to a server, where those electrical signals are converted into instructions on how to store your info. That info is then sent through a similar process back to your phone, where your phone interprets those signals and renders them on your screen. It is functionally magic.

All of this (oversimplified) process happens in less than a few seconds, faster than it took you to think up your overwrought comment about technologies you enjoy the benefits of. Funny

duh. That’s like saying there is nothing to be said against corporate capitalism because you buy your groceries from a multinational food conglomerate. It’s an accusation of hypocrisy, an ad hominem argument. Weak.

While these advancements are tempting, Michael Crichton had a valid point in regard to humans playing God with nature. We are never quite as clever as we think.

Or do we actually act overcautiously because we are constantly subject to these moralistic warnings? No, we never understand everything, but all technologies have risks -- both known and unknown. We can never eliminate them, but we often do as much harm by holding back potential cures and advancements as we do by going too fast.

Just look at the rollout of the COVID vaccine. They could have made it available to anyone who wanted to take it much earlier (immediately after the human safety tests, before knowing if it was effective). Lots of people died because we couldn't risk letting people decide for themselves that the minuscule risk associated with a potentially ineffective vaccine was worth taking.

"Just look at the rollout of the COVID vaccine." Or, look at the origins of Covid. Our hubris in thinking that we could prevent a massive pandemic likely caused one.

Good point. But with nature we are dealing with an ever-changing landscape, as it evolves as well. Mr. Crichton's point, and the one I try to make, was not to cease moving forward, but to have respect for the "opponent" - Mother Nature.

Excellent. Thank you for publishing this, Noahpinion, and to Joshua and Kasia for spelling this out.

This is a very good overview for a general audience. I appreciated the mention of the creation of a Foundational Model of human biology for better outcomes utilizing Artificial Intelligence.

I appreciated this bit of Negentrope's comment : "...it's worth remembering that our efforts to understand biology at a fundamental level have up until very recently been like attempting to understand the engineering principles of an alien supercomputer written in a language we can't understand. Mostly we've just been breaking things and seeing if it produces any interesting or notable effects..."

This is a great reference point for what a human labor workflow looks like in a highly AI-integrated industry. The human adds new processes and data streams. The human curates the output. The human integrates multiple lines of evidence. The human handles the clinical trials. The human decides the profit opportunity. But the industry is not possible without the power of massive AI scaling for incredibly complex problems. Viewed another way, the human's job is to feed the AI model what it needs to do its job effectively.

Jensen Huang was asked about the prospects for the robotics industry with the advent of powerful AI, and he described a similar model with a similar emerging opportunity. An AI robotics company still has to do the robotics testing, development, and process integration - the AI doesn't obviate the need for that - but AI makes possible a rapid scaling of this complex, multifaceted process.

> Each of us is comprised of trillions of cells

Each of us is composed of trillions of cells. Each of us comprises trillions of cells. Either is acceptable.

"Comprise": to include, contain, encompass, circumscribe.

comprise (v.): early 15c., "to include," from Old French compris, past participle of comprendre "to contain, comprise" (12c.), from Latin comprehendere "to take together, to unite; include; seize; to comprehend, perceive" (to seize or take in the mind), from com "with, together," here probably "completely" (see com-) + prehendere "to catch hold of, seize".

If we are now in Biology where we were 50 years back with software, it will be an ... exciting experience. Will we put "buggy code" in living creatures (food crops? Extinct animals?) and have them reproduce? Almost certainly based on our history with fossil fuels, plastics, the internet, etc.

Whoever thought the dry sciences and wet sciences would intersect to create solutions? Companies such as EXAI (Cambridge, England) are using AI/ML to streamline the time it takes to develop drugs for trials. Beam Therapeutics (Cambridge, MA) has developed single-base editing, more advanced than CRISPR. Some diseases are a by-product of a single flawed base pair. Because genes work in concert with other genes, there are dangers inherent in these types of treatments. Why cut a strand of DNA when you can restrict your target to the replacement of a single flawed base pair? This may be less likely to trigger a dangerous reaction from other genetic material.

Sounds great. When will they be able to treat prostate cancer without giving men hot flashes?

They probably can already, in mice ;)

This is a nice article. Biotechnology is the power to control life itself, and so many of its applications— in saving billions of baby chicks from being culled, in cell therapies, etc— go unnoticed. We publish a new essay on biotechnology every week, if you’re interested in learning more.

Great article, but I’m surprised you didn’t mention competitions in bioengineering like iGEM - it showcases just how powerful and creative the potential is. Well worth checking out their website and having a look at the variety of projects that different universities work on (all of the projects are also fully laid out on their own websites, allowing anyone interested to read into how it all works).

Will biological engineering free us to be the angels we were meant to be or make us into perfectly subservient robot slaves for the capital? https://en.redjustice.net/kapitalets-robotslavar-eller-frigjorda-anglar/
