Chip Wars Review: The Untold Battleground of the Digital Age



Most history textbooks contain a chapter on the Cold War; yet amid all the spy drama and existential dread, they devote only a paragraph or two to the most important development of the era: the rise of the digital age. Both sides of the Cold War knew that the rapidly developing semiconductor industry was key to their success, and as the conflict entered its final chapter, that same technology became central to the transition to globalization.

The semiconductor industry, like the rest of the computing world, traces its intellectual roots to Alan Turing; however, its path from government project to societal necessity was not readily apparent. Since the first computers depended on unreliable, inefficient vacuum tubes, they remained feasible only for the government or large government-funded labs. This all changed when scientists at Bell Labs invented the transistor at the end of 1947: suddenly, the fundamental technology of computing was scalable, customizable, and shrinkable. The American government redoubled its efforts to build transistor-based computers, and the digital age was born; Soviet spies soon caught wind of the new, powerful invention, and the race for technological dominance began. American firms like Fairchild Semiconductor and Texas Instruments (TI), with funding from the American government, began ruthlessly optimizing and innovating under the guidance of visionaries like Bob Noyce, Gordon Moore, and Morris Chang. The Soviets tried to steal and replicate American technology, but they remained far behind American firms, which were increasingly focused on selling their products to ordinary consumers.

As more computer firms popped up throughout the U.S., making both semiconductors and the machinery required to produce them, chips began running dishwashers instead of tanks. Fairchild founders Bob Noyce and Gordon Moore, tired of meddling from investors and the board, left to found Intel; their choice of headquarters on the West Coast would cement the position of Silicon Valley in the decades to come. Intel started out pioneering memory chips; however, Japanese firms such as Toshiba and NEC entered the market with much cheaper, and often more reliable, memory chips, backed by protection from the Japanese government, forcing Intel to pivot to making microprocessors. Intel faced extreme hardship during this painful transition, but it was guided by the constant paranoia of CEO Andy Grove, whose fear of falling behind pushed the company to innovate and rethink itself ruthlessly; Grove's paranoia was key to making Intel a giant in the chip industry.

Meanwhile, China's Mao Zedong, who wanted to enter the technological age without sacrificing Communist ideals, purged the country of its best scientists in pursuit of a "utopian" vision in which every citizen made semiconductors at home; the initiative only succeeded in setting back the Chinese economy by decades, even as places like Hong Kong attracted American firms' assembly facilities with low wages and productive workers. Taiwan, not wanting to repeat the failures of China and the Soviet Union, integrated itself into the semiconductor supply chain by inviting TI veteran Morris Chang to build the island's chip industry. Chang created the Taiwan Semiconductor Manufacturing Company (TSMC) as a global manufacturing hub; TSMC would only manufacture chips designed by other companies, allowing it to specialize and become a world leader while relieving other semiconductor firms of the enormous cost of fabrication. Though established chipmakers were initially wary of this model, over 90% of the world's most advanced semiconductors are now made in Taiwan, most of them by TSMC. Other countries followed Taiwan's lead: South Korea's Samsung overtook Japanese firms, weakened by a severe economic downturn in the 1990s, as the largest producer of memory chips; the Dutch company ASML prospered by focusing on the cutting-edge lithography machines required to produce semiconductors; and a post-Mao China introduced the world to Huawei.

Huawei illustrated the dangers of the newly globalized world: as its dominance in cheap cellular-network equipment grew, governments around the world began to ban it, suspecting the Chinese government of using Huawei's technology to spy on other countries. The U.S., alarmed at the potential rise of a vast spy network on domestic soil, used its privileged position in the semiconductor supply chain to cut Huawei off from the American components it needed to produce cutting-edge technology. Today, Huawei lags far behind the rest of the world, and the U.S. is confident that its place in the supply chain lets it control the rise and fall of entire companies. Even now, the semiconductor industry is being revolutionized: fabless companies like Nvidia, which focus entirely on chip design and outsource fabrication to TSMC, are rising, while older companies like Intel, which still fabricate chips in-house, are beginning to struggle. Whatever the future brings, it will likely be entirely novel and unexpected, just as the past innovations of giants like Bob Noyce, Morris Chang, and Andy Grove were.
