Who Will Win the “Legacy Chip War”? Let’s hope it’s not China.

The computer chip shortage that began in late 2020 made everyone realize that the most cutting-edge, or advanced, computer chips are no longer made in the United States, leaving the country dependent on others for these critical components. What is striking is that the shortage was mostly a shortage of “legacy chips.”

These are chips that American companies still make, but not in volumes sufficient to meet demand from U.S. manufacturers. The shortfall hurt not just traditional industries like automaking, but any device that relies on a range of chip technologies to control displays, play sound, run engines, and perform other key tasks. Without these parts, the U.S. economy suffered major disruptions, prompting a closer look at the importance of these “legacy” chips.

So, what are these “legacy” chips? They are chips made using mature but still-improving manufacturing processes, at larger feature sizes than the leading edge. The CHIPS and Science Act of 2022 defines legacy devices as those made with 28-nanometer (nm) technology or larger. The exact definitions for other categories of chips are still being worked out. “Cutting-edge” has no settled definition either, but it probably covers chips made at or below 5 nm, and there is some ambiguity about where the advanced 10 nm and 7 nm processes fit. In any case, what counts as “legacy” or “cutting-edge” shifts over time, as technology continues to advance at a rate predicted by Moore’s Law.

Legacy chips are everywhere. They’re used in everything from cars and planes to household appliances, broadband, consumer electronics, factory systems, military equipment, and medical devices. They play a key role in U.S. manufacturing, so any disruption in their supply has a big impact on manufacturing and the wider economy.

Even though they’re called “legacy”, these chips aren’t old or outdated. They’re continually updated to meet new needs and are used in lots of different ways. For example, some are made with silicon carbide, a material that’s expected to be important for reducing carbon emissions. Legacy chips are likely to remain important for new industries and technologies for a long time to come.

Thinking of legacy chips as simply separate from cutting-edge chips on the basis of process node may limit our understanding of their importance. The term “legacy” dates from an era when the military was a major driver of chip development. Recent actions to limit China’s ability to make advanced chips, which push Chinese firms to focus on mainstream chips instead, may be missing a big risk.

If the U.S. wants to protect its economy from the effects of China’s industrial policy, it cannot simply sort chips into “advanced” and “less advanced” by process node. Instead, decision-makers need to think more carefully about the importance of specialized legacy chips and how to support their production and ongoing development.

The shortage of these chips disrupted the operations of U.S. automakers and other manufacturers. Medical device makers, for example, struggled because they could not get enough of the legacy chips needed to power their machines. The scarcity forced companies to scramble for electronic parts on the spot market, making it hard for patients to obtain certain critical devices.

The chip shortage also hit consumer electronics companies. In 2021, Apple had to scale back plans to make iPhones and other devices because there were not enough chips. For the iPhone 13 Pro Max, a complex device with over 2,000 parts, the bottleneck was not a lack of the most advanced chips. Instead, production was held up by a shortage of inexpensive components: power management chips made by Texas Instruments, transceivers from Nexperia, and connectivity devices from Broadcom. What is important to remember is that these kinds of chips are used not just in iPhones, smartphones, or consumer electronics generally, but in computers, data centers, home appliances, and internet-connected cars.

A big reason for the chip shortage is that too little has been invested in expanding legacy chip capacity. The reason is simple: the returns on such investment are much lower. Right now, only about one-sixth of all semiconductor investment goes toward making legacy chips. Despite the lower returns, several big chipmakers, including Infineon, Analog Devices, Texas Instruments, and NXP Semiconductors, are now looking to invest more in these larger-node chips.

Another big risk for U.S. companies weighing investment in legacy chips is China. The Chinese government recently announced that it will put $143 billion into its chip industry. China is estimated to have spent $12.3 billion in 2021 and $15.3 billion in 2022 on developing its semiconductor industry, about 15% of the worldwide total. If China can obtain the necessary manufacturing equipment, it is expected to nearly double its chip production capacity over the next 10 years, reaching roughly 19% of the world’s total production.

Because Western restrictions block the sharing of advanced chip technologies with China, most of China’s new investment will probably go into making legacy devices (those at 28 nm and above). One unintended consequence of the U.S. limiting China’s access to advanced chip technology may thus be a huge wave of state-backed investment, leading to overproduction and, potentially, Chinese dominance of global legacy chip production.

U.S. experts are already warning of the risk that China could come to dominate this segment.