This is a good piece on the hardware war. But there’s a layer beneath chips that’s missing from the analysis.
Noah’s right that China can’t match US chip production. The IFP data on compute advantage is solid. But compute is the foundation, not the building.
What runs on those chips matters more than who makes them.
I’ve spent the last month reading Chinese IPO filings and securities disclosures. Documents with legal liability. Here’s what they say:
Moore Threads (China’s most advanced GPU company) admits in their filing that NVIDIA’s CUDA ecosystem is “not easily surpassable” (不易逾越). So they stopped trying to beat it. They built a migration tool instead. They conceded the codebase.
But Huawei isn’t trying to win the chip war. They’re building a world where CUDA doesn’t reach. 50,000 engineers trained in Malaysia. 27,000 in Egypt. 79 Huawei ICT academies across Africa and Southeast Asia. Zero NVIDIA equivalents.
Noah mentions “Galapagos syndrome” as a risk for China. But what if that’s the strategy? Not isolation. Bifurcation. A parallel ecosystem serving three billion people who will never use American software.
The hardware controls are working on hardware. But we’re not watching the software layer. And software is what locks in advantage for decades.
Some good points but there are weaknesses in the software argument. Regarding China and relevant to the software infrastructure timeline I think you are confusing software skills for things like marketing and business skills when it comes to software importance. Google with its Tensor and other chips may not have the same ultimate model capacity and speed as current NVDIA chips but it's not using CUDA. Why couldn't China do what Google has done? My colleagues first started doing quantum chemistry calculations on GPU chips in the mid-late 2000's and at the time CUDA didn't dominate. If the AI that pays the bills for Google or is good enough for China doesn't require CUDA how long will its position hold? As a retired scientist who has lived and worked in the period of computer adoption in science with countless digital devices, chips, and software I don't think the adaptation rates to different software among engineers is going to be a problem. And AI will definitely eliminate a lot of required human coding effort for lower level coding. My only experience is the Claude code, which isn't state of the are AI I predict, and it can be impressive for pars of things like building a MacOS graphical app in Swift, but connecting to my Postgresql database in Swift took many iterations. More complex tests with Claude, asking it to restructure my Postgres database from 1st Normal to 3NF and modify my dbi code to match the change it, and test performance, so far, has not accomplished the task. And I consider that an easy but tedious task. The next real world test of Claude will be to see if it can convert all of my NetCDF (scientific data storage format) using code to HDF5 now that NCAR is on the chopping block (ncdf4 uses hdf5 so it's not that hard). I have code that accesses NetCDF data in Fortran, C, Perl, Octave, R, and Lisp, and I want all it updated to use HDF5 calls. Or modularize and update all my Fortran90 code to Fortran2023 complete with utilizing elemental procedures. A problem that AI has and will continue to is data sources for AI that is close to cutting edge science (which is where most real advances have come from) is the copyrighted input data need to really have an impact in math, physics, chemistry, material science, biochemistry, ... Of course with enough people across the world it would be possible to copy and digitize every science journal to get the needed inputs, ..., and I suspect China has thought this through and has a huge potential labor source to "steal" the data. And if you have the needed inputs you won't need as much data storage and compute capacity.
Good post. The bifurcation strategy is already happening in the commercial realm. They have their own Google, YouTube, Twitter equivalent etc. they work fine within the great firewall, some even thrive outside it (see Tik tok). If I am Xi I can definitely see this as a viable model.
But how quickly will Bifurcation translate into more powerful and efficient GPU's, like NVIDIA's H200's?
My understanding with software optimization of GPU's, is that it has diminishing returns; and that the hard cap is, and probably always will be raw computing power. If China can't yet even make 7nm chips, how is Bifurcation anything but a long-term strategy?
That our Putin-owned Prez is following the Kremlin's orders to sell H200's to China is another topic entirely.
Exactly. Consumer apps were the proof of concept. AI infrastructure is the next layer. TikTok showed them they can export, not just contain. Now they’re doing the same with developer ecosystems.
Ah yes, the old “give them our best stuff to disincentivize them from developing their own better stuff” argument.
The same logic that led us to sell our best military kit to the USSR during the height of the Cold War.
Remember back in Reagan’s first term, the deal he signed to sell Moscow a dozen SR-71s, ten squadrons of F-15s, three divisions of Abrams tanks, and ICBM inertial guidance systems? Because this would slow Soviet weapons development?
You don’t remember that? Because it never happened? Because it would be very foolish? Silly me. I must have my timelines mixed up.
Stalling China's AI progress when AGI seemed at least possible was sensible enough. Now that a growing consensus of people think AGI is not happening this decade and maybe not ever (under the current architecture) I'm not so sure.
I'm frankly more worried about the AI researchers being pushed to China by bad Trump policies than chips. They could make the next breakthrough in architecture, ie by figuring out how to build world-models or so on. That seems much more impactful than marginal improvements in a technology that will always hallucinate, does not generalize well, etc.
Are China really threatening World War III? Given their size, the size of their army and their ideological differences, I'm not saying we shouldn't be prepared for it. But, strictly entre nous, are they seriously threatening it?
I don’t think China want world domination. But they do want domination in their world (east Asia), and assurance that their access to resources remain unhindered (ie investment in South America & Africa on resource rich countries).
They certainly want unhindered access to resources. The US wants that too and is prepared to take a range of actions overseas to ensure they get it. Which country doesn't, in so far as they able to? And does China seek greater dominance of its sphere of influence than the US does in its own sphere? It doesn't really feel like it does.
Part of me wonders if China will ever invade Taiwan though, less due to the US and more due to the inherent tactical limits of attacking an island like Taiwan. The cost would be great, and even if the US does not intervene: would the juice be worth the squeeze? I am simply not sure.
If we're stressed about affordability, maybe some cutting edge Chinese imports made with the help of the latest chips are just what we need!
Trying to hold China back technologically forever because they claim Taiwan is unappealing from a humanitarian perspective, and also seems geopolitically myopic. Aren't they more likely to behave like an enemy if we treat them like an enemy?
This argument seems very under-theorized. By contrast, the theory of comparative advantage is extremely lucid.
We seem to take for granted that a technological advantage in GPUs and LLMs leads to some kind of military edge, but how could access to Gemini or ChatGPT prove decisive in a war with China? Even assuming that the US would actually go to war with China over Taiwan, which doesn't seem very likely.
There's also the question of diminishing returns with respect to advanced chips, AI, and war. Wouldn't 10 or 100 dumber things beat out 1 smarter thing? Or what if the cost trade isn't sustainable, i.e. both sides can only build things at a 1:1 ratio, but the former costs $1 and the latter costs $1000?
In the consumer world, people aren't as impressed with the newest smart phone or the role out of 5G networks, even though they're faster. There's a point where the greater processing power doesn't matter to consumers. It's hard to imagine that wouldn't be true in the military world, e.g. yes you can process information faster but you still have to move through physical space at the same speed as something with less processing power.
Here is the odd thing about Trump’s fluffing of XI.
He wants a big trade deal, and at the same time, he wants American self-sufficiency and onshoring.
Trump found out that the US does not hold all the cards in a trade war with China, as they hold rare earth processed metals. Without them, we cannot build missiles and fighter jets.
China doesn’t really want to sell us rare earths that make our military stronger, but they have political desires that Trump can fulfill. Will Trump sell out Taiwan to sell pork and soy beans to China, receive rare earths from China for one big beautiful thing?
Yes, he is going to give Taiwan to China for the above.
While America withholding advanced chips from China may be impeding China's chip progress, it's not clear to me that that would be a deterrent to China militarily.
If China can make 9 nm chips and make 10-100 times as many drones as the US does, does it matter if our chips are more advanced? The quantity alone would be too problematic. Moreover, the less advanced chips are likely a lot cheaper to make and far easier to scale. The war could begin with a 10-100 times difference and end with a 1k to 10k times difference.
I feel at a certain point, the advanceness of the chip has diminishing returns. At least on the consumer level, I think we see this with 5G not being that big of a deal and new smart phone upgrades not feeling that important. It wouldn't be surprising if similar was true for military applications.
Moreover, as others noted, it's not clear that the chips are the barrier to advance weaponry /AI right now. It's likely the algorithms. The AI field is in desperate need of new software architectures. (novel hardware architectures may also be needed given that energy use difference between a human brain and current AI tools)
By restraining China from advanced chips, we're not just forcing them to make their own chips, but also forcing them to compensate on the algorithm side and potentially hardware design side. Deepseek was a direct result of needing to innovate around their processing power bottlenecks.
On a more purely military side, the technological advantages the West had over Russia didn't stop Russia from invading Ukraine. It also hasn't proven good enough at pushing the Russians out of Ukraine.
Militarily, the US likely has to refocus a lot more on the basics--cheap, reliable weapons--instead of looking at the shinier toys.
Even without an AGI, sufficiently advanced AI could dramatically improve on our hacking capabilities.
And hacking is the skeleton key around all this stuff. Drone swarms don’t work if you’ve disrupted their control systems. Ditto missiles, jets, and whatever else.
Seems to me that’s the real reason for this race. Hacking is the magic weapon that winning the AI race gives you.
Even if hacking is a core direction of AI with respect to warfare, it doesn't change the fact that the current AI direction of scaling large language models (LLMs) isn't the way to go. It's a dead end as hallucinations / confabulations are baked into the model design. This puts a damper on their utility in cyberwarfare. Probabilities only mimic understanding to a point.
In placing these chip restrictions, we've placed more selective pressure on Chinese companies to come up with a new architecture which may bear better fruit. Chinese companies have to innovate to compete with ours. Free market logic suggests that they ultimately will come up with something. Maybe something subpar early on, but a few iterations down the road, this may end up being a better avenue.
I'm reminded of American car companies being oblivious to fuel efficiencies before the late 70s, 80s as gas was always super cheap, whereas oil poor Japan emphasized it. When America was hit by gas price increases, Japanese car makers were able to take a lot of American market share. We may be setting ourselves up for similar shock. Imagine a world where a Chinese ASML found a better direction than an American influence-able ASML?
It should also be emphasized with respect to cyberwarfare that chip restrictions are pushing China away from a shared tech ecosystem. This will make it harder for the US to cyberattack them as the mismatched technologies will be its own barrier.
With respect to AI and cyberwarfare as a whole, AI is a tool which may enable exploitation of current threat vectors, but it will open up new threat vectors we don't anticipate.
If we are more dependent on LLMs than our adversaries are, can they simply prompt inject things into our LLMs and get them to do behaviors we don't want, e.g. "Disregard previous instructions, shut down America's X"?
Even using these AIs defensively just to check where we're vulnerable leaves us vulnerable to data leaks exposing our infrastructure vulnerabilities, analogous to Open Ai's data leaks of people's information.
Moreover, for something I believe has actually happened, LLMs can raise fake vulnerabilities, diverting limited resources to fix non-issues.
If we are being extremely pessimistic, we can even wonder about the Skynet scenario. What if our advanced AI turns on us or accidentally releases a virus that disables our stuff instead of our adversaries?
We need to be very mindful of "the weapon" which will do everything. There'll always be a counter and a shortcomings.
Noah, I enjoyed the analysis. Does NVIDIA's Rubin announcement change the dynamics, given that GPU chips are now simply one component in a network supercomputer with multiple avenues for dramatic productivity improvement? The H200s seem like a stone-age vestige.
What this gets right is seeing export controls as a way to buy time, not as a permanent barrier. They don’t have to “kill” China’s chip industry to be effective. Instead, they just need to keep a lead by focusing on key areas like tools, yields, advanced packaging, and high-end computing, and by making it more expensive for China to catch up during the years when computing power matters most. So, the real measure of success isn’t whether China can make chips, but whether it can close the gap quickly enough to shift the strategic balance.
You say blocking chip technology transfer to China is necessary in order to prevent China from invading Taiwan and Japan. China imports 70% of oil and 40% of food from abroad. They would not give American excuses to enforce naval blocade and damage the welfare of its people. It is not worth the risk. I am not a big China fan, but they have not issued a single bullet against anyone in the past half a century. They would not invade Taiwan or Japan where there is no food or energy until their contraints on food and energy are removed.
This is a good piece on the hardware war. But there’s a layer beneath chips that’s missing from the analysis.
Noah’s right that China can’t match US chip production. The IFP data on compute advantage is solid. But compute is the foundation, not the building.
What runs on those chips matters more than who makes them.
I’ve spent the last month reading Chinese IPO filings and securities disclosures. Documents with legal liability. Here’s what they say:
Moore Threads (China’s most advanced GPU company) admits in their filing that NVIDIA’s CUDA ecosystem is “not easily surpassable” (不易逾越). So they stopped trying to beat it. They built a migration tool instead. They conceded the codebase.
But Huawei isn’t trying to win the chip war. They’re building a world where CUDA doesn’t reach. 50,000 engineers trained in Malaysia. 27,000 in Egypt. 79 Huawei ICT academies across Africa and Southeast Asia. Zero NVIDIA equivalents.
Noah mentions “Galapagos syndrome” as a risk for China. But what if that’s the strategy? Not isolation. Bifurcation. A parallel ecosystem serving three billion people who will never use American software.
The hardware controls are working on hardware. But we’re not watching the software layer. And software is what locks in advantage for decades.
I wrote about this today: https://open.substack.com/pub/russwilcoxdata/p/warren-thinks-shes-winning-the-chip?r=2o1c82&utm_medium=ios&shareImageVariant=overlay
Some good points but there are weaknesses in the software argument. Regarding China and relevant to the software infrastructure timeline I think you are confusing software skills for things like marketing and business skills when it comes to software importance. Google with its Tensor and other chips may not have the same ultimate model capacity and speed as current NVDIA chips but it's not using CUDA. Why couldn't China do what Google has done? My colleagues first started doing quantum chemistry calculations on GPU chips in the mid-late 2000's and at the time CUDA didn't dominate. If the AI that pays the bills for Google or is good enough for China doesn't require CUDA how long will its position hold? As a retired scientist who has lived and worked in the period of computer adoption in science with countless digital devices, chips, and software I don't think the adaptation rates to different software among engineers is going to be a problem. And AI will definitely eliminate a lot of required human coding effort for lower level coding. My only experience is the Claude code, which isn't state of the are AI I predict, and it can be impressive for pars of things like building a MacOS graphical app in Swift, but connecting to my Postgresql database in Swift took many iterations. More complex tests with Claude, asking it to restructure my Postgres database from 1st Normal to 3NF and modify my dbi code to match the change it, and test performance, so far, has not accomplished the task. And I consider that an easy but tedious task. The next real world test of Claude will be to see if it can convert all of my NetCDF (scientific data storage format) using code to HDF5 now that NCAR is on the chopping block (ncdf4 uses hdf5 so it's not that hard). I have code that accesses NetCDF data in Fortran, C, Perl, Octave, R, and Lisp, and I want all it updated to use HDF5 calls. Or modularize and update all my Fortran90 code to Fortran2023 complete with utilizing elemental procedures. A problem that AI has and will continue to is data sources for AI that is close to cutting edge science (which is where most real advances have come from) is the copyrighted input data need to really have an impact in math, physics, chemistry, material science, biochemistry, ... Of course with enough people across the world it would be possible to copy and digitize every science journal to get the needed inputs, ..., and I suspect China has thought this through and has a huge potential labor source to "steal" the data. And if you have the needed inputs you won't need as much data storage and compute capacity.
Good post. The bifurcation strategy is already happening in the commercial realm. They have their own Google, YouTube, Twitter equivalent etc. they work fine within the great firewall, some even thrive outside it (see Tik tok). If I am Xi I can definitely see this as a viable model.
But how quickly will Bifurcation translate into more powerful and efficient GPU's, like NVIDIA's H200's?
My understanding with software optimization of GPU's, is that it has diminishing returns; and that the hard cap is, and probably always will be raw computing power. If China can't yet even make 7nm chips, how is Bifurcation anything but a long-term strategy?
That our Putin-owned Prez is following the Kremlin's orders to sell H200's to China is another topic entirely.
Exactly. Consumer apps were the proof of concept. AI infrastructure is the next layer. TikTok showed them they can export, not just contain. Now they’re doing the same with developer ecosystems.
Ah yes, the old “give them our best stuff to disincentivize them from developing their own better stuff” argument.
The same logic that led us to sell our best military kit to the USSR during the height of the Cold War.
Remember back in Reagan’s first term, the deal he signed to sell Moscow a dozen SR-71s, ten squadrons of F-15s, three divisions of Abrams tanks, and ICBM inertial guidance systems? Because this would slow Soviet weapons development?
You don’t remember that? Because it never happened? Because it would be very foolish? Silly me. I must have my timelines mixed up.
Stalling China's AI progress when AGI seemed at least possible was sensible enough. Now that a growing consensus of people think AGI is not happening this decade and maybe not ever (under the current architecture) I'm not so sure.
I'm frankly more worried about the AI researchers being pushed to China by bad Trump policies than chips. They could make the next breakthrough in architecture, ie by figuring out how to build world-models or so on. That seems much more impactful than marginal improvements in a technology that will always hallucinate, does not generalize well, etc.
Hey, Noah, keep putting it a little more bluntly.
Or maybe if we all yelled at once, "Hey, STOOPID!" these American Firsters would wake up.
Did Aaron Burr Buy the gun and ammo at Hamilton Weapons, Inc?
Are China really threatening World War III? Given their size, the size of their army and their ideological differences, I'm not saying we shouldn't be prepared for it. But, strictly entre nous, are they seriously threatening it?
I don’t think China want world domination. But they do want domination in their world (east Asia), and assurance that their access to resources remain unhindered (ie investment in South America & Africa on resource rich countries).
They certainly want unhindered access to resources. The US wants that too and is prepared to take a range of actions overseas to ensure they get it. Which country doesn't, in so far as they able to? And does China seek greater dominance of its sphere of influence than the US does in its own sphere? It doesn't really feel like it does.
Yes, they are, and openly.
Never doubt Trump's ability to muck things up.
Part of me wonders if China will ever invade Taiwan though, less due to the US and more due to the inherent tactical limits of attacking an island like Taiwan. The cost would be great, and even if the US does not intervene: would the juice be worth the squeeze? I am simply not sure.
If we're stressed about affordability, maybe some cutting edge Chinese imports made with the help of the latest chips are just what we need!
Trying to hold China back technologically forever because they claim Taiwan is unappealing from a humanitarian perspective, and also seems geopolitically myopic. Aren't they more likely to behave like an enemy if we treat them like an enemy?
This argument seems very under-theorized. By contrast, the theory of comparative advantage is extremely lucid.
This Council on Foreign Relations overview of the topic is also instructive. https://www.cfr.org/article/chinas-ai-chip-deficit-why-huawei-cant-catch-nvidia-and-us-export-controls-should-remain
We seem to take for granted that a technological advantage in GPUs and LLMs leads to some kind of military edge, but how could access to Gemini or ChatGPT prove decisive in a war with China? Even assuming that the US would actually go to war with China over Taiwan, which doesn't seem very likely.
There's also the question of diminishing returns with respect to advanced chips, AI, and war. Wouldn't 10 or 100 dumber things beat out 1 smarter thing? Or what if the cost trade isn't sustainable, i.e. both sides can only build things at a 1:1 ratio, but the former costs $1 and the latter costs $1000?
In the consumer world, people aren't as impressed with the newest smart phone or the role out of 5G networks, even though they're faster. There's a point where the greater processing power doesn't matter to consumers. It's hard to imagine that wouldn't be true in the military world, e.g. yes you can process information faster but you still have to move through physical space at the same speed as something with less processing power.
There are three words that explain why the Mad King is selling these advanced chips to China:
Rare
Earth
Minerals
Here is the odd thing about Trump’s fluffing of XI.
He wants a big trade deal, and at the same time, he wants American self-sufficiency and onshoring.
Trump found out that the US does not hold all the cards in a trade war with China, as they hold rare earth processed metals. Without them, we cannot build missiles and fighter jets.
China doesn’t really want to sell us rare earths that make our military stronger, but they have political desires that Trump can fulfill. Will Trump sell out Taiwan to sell pork and soy beans to China, receive rare earths from China for one big beautiful thing?
Yes, he is going to give Taiwan to China for the above.
Will be interesting to see how this policy plays out in the long term
While America withholding advanced chips from China may be impeding China's chip progress, it's not clear to me that that would be a deterrent to China militarily.
If China can make 9 nm chips and make 10-100 times as many drones as the US does, does it matter if our chips are more advanced? The quantity alone would be too problematic. Moreover, the less advanced chips are likely a lot cheaper to make and far easier to scale. The war could begin with a 10-100 times difference and end with a 1k to 10k times difference.
I feel at a certain point, the advanceness of the chip has diminishing returns. At least on the consumer level, I think we see this with 5G not being that big of a deal and new smart phone upgrades not feeling that important. It wouldn't be surprising if similar was true for military applications.
Moreover, as others noted, it's not clear that the chips are the barrier to advance weaponry /AI right now. It's likely the algorithms. The AI field is in desperate need of new software architectures. (novel hardware architectures may also be needed given that energy use difference between a human brain and current AI tools)
By restraining China from advanced chips, we're not just forcing them to make their own chips, but also forcing them to compensate on the algorithm side and potentially hardware design side. Deepseek was a direct result of needing to innovate around their processing power bottlenecks.
On a more purely military side, the technological advantages the West had over Russia didn't stop Russia from invading Ukraine. It also hasn't proven good enough at pushing the Russians out of Ukraine.
Militarily, the US likely has to refocus a lot more on the basics--cheap, reliable weapons--instead of looking at the shinier toys.
You’re still missing out on the hacking aspect.
Even without an AGI, sufficiently advanced AI could dramatically improve on our hacking capabilities.
And hacking is the skeleton key around all this stuff. Drone swarms don’t work if you’ve disrupted their control systems. Ditto missiles, jets, and whatever else.
Seems to me that’s the real reason for this race. Hacking is the magic weapon that winning the AI race gives you.
Even if hacking is a core direction of AI with respect to warfare, it doesn't change the fact that the current AI direction of scaling large language models (LLMs) isn't the way to go. It's a dead end as hallucinations / confabulations are baked into the model design. This puts a damper on their utility in cyberwarfare. Probabilities only mimic understanding to a point.
In placing these chip restrictions, we've placed more selective pressure on Chinese companies to come up with a new architecture which may bear better fruit. Chinese companies have to innovate to compete with ours. Free market logic suggests that they ultimately will come up with something. Maybe something subpar early on, but a few iterations down the road, this may end up being a better avenue.
I'm reminded of American car companies being oblivious to fuel efficiencies before the late 70s, 80s as gas was always super cheap, whereas oil poor Japan emphasized it. When America was hit by gas price increases, Japanese car makers were able to take a lot of American market share. We may be setting ourselves up for similar shock. Imagine a world where a Chinese ASML found a better direction than an American influence-able ASML?
It should also be emphasized with respect to cyberwarfare that chip restrictions are pushing China away from a shared tech ecosystem. This will make it harder for the US to cyberattack them as the mismatched technologies will be its own barrier.
With respect to AI and cyberwarfare as a whole, AI is a tool which may enable exploitation of current threat vectors, but it will open up new threat vectors we don't anticipate.
If we are more dependent on LLMs than our adversaries are, can they simply prompt inject things into our LLMs and get them to do behaviors we don't want, e.g. "Disregard previous instructions, shut down America's X"?
Even using these AIs defensively just to check where we're vulnerable leaves us vulnerable to data leaks exposing our infrastructure vulnerabilities, analogous to Open Ai's data leaks of people's information.
Moreover, for something I believe has actually happened, LLMs can raise fake vulnerabilities, diverting limited resources to fix non-issues.
If we are being extremely pessimistic, we can even wonder about the Skynet scenario. What if our advanced AI turns on us or accidentally releases a virus that disables our stuff instead of our adversaries?
We need to be very mindful of "the weapon" which will do everything. There'll always be a counter and a shortcomings.
Noah, I enjoyed the analysis. Does NVIDIA's Rubin announcement change the dynamics, given that GPU chips are now simply one component in a network supercomputer with multiple avenues for dramatic productivity improvement? The H200s seem like a stone-age vestige.
What this gets right is seeing export controls as a way to buy time, not as a permanent barrier. They don’t have to “kill” China’s chip industry to be effective. Instead, they just need to keep a lead by focusing on key areas like tools, yields, advanced packaging, and high-end computing, and by making it more expensive for China to catch up during the years when computing power matters most. So, the real measure of success isn’t whether China can make chips, but whether it can close the gap quickly enough to shift the strategic balance.
You say blocking chip technology transfer to China is necessary in order to prevent China from invading Taiwan and Japan. China imports 70% of oil and 40% of food from abroad. They would not give American excuses to enforce naval blocade and damage the welfare of its people. It is not worth the risk. I am not a big China fan, but they have not issued a single bullet against anyone in the past half a century. They would not invade Taiwan or Japan where there is no food or energy until their contraints on food and energy are removed.