A microchip timeline from 1959 to the supply chain shortage

The microchip supply chain shortage has stalled the electronics and manufacturing sectors. Here is a timeline of one of the most pivotal 20th-century inventions

COVID-19 lockdowns halted work across sectors, including microchip manufacturing plants, triggering a supply chain crisis. Microchips are essential to most modern electronics - you’re using one to read this article - and the resulting supply chain delays have caused problems for manufacturers.

But where did the microchip originate, what is it used in and what will come next in the electronics evolution? After all, necessity is the mother of invention, and the impact of COVID-19 has pushed the world to think outside the box. As desperate Russian soldiers search bombed Ukrainian homes for microchips to power their weapons, engineers are looking for ways to advance the essential microchip and prevent another supply chain crisis.

This timeline will explain more, but to prevent confusion, here is some terminology around chips, none of which are potato-related food items:


Microchips (‘chips’)

Also known as microcircuits, integrated circuits or semiconductor components. A microchip is a small piece of silicon on which an integrated circuit is fabricated; many chips are made at once on a larger silicon wafer, which is then cut apart. These circuits process and store information.

Semiconductor

Semiconductors are materials, such as silicon, that conduct electricity under some conditions but not others - somewhere between a conductor and an insulator. This property makes them the basis of transistors and microchips, found in televisions and most other electronics.

Transistors

Made from silicon, a chemical element found in sand, these are tiny electronic switches. Billions of them working together let a chip do jobs once reserved for the human brain: processing and remembering information.

1959 - The microchip is invented 

Computers in the 1950s were huge and very expensive. Thousands of tiny components had to be wired together by hand, a problem engineers called the ‘tyranny of numbers’.

Engineer Jack Kilby, of electronics firm Texas Instruments, was frustrated by this and set out to find an answer. He built the components into a single chip, which could do the same job without the hand-wired connections.

At the Institute of Radio Engineers’ annual trade show in New York, the company showed off its new device.


1961 - One expensive step

In the first half of the 1960s, a single microchip cost US$31. The United States Air Force used them in missiles and NASA used them in the Apollo programme.


1965 - Moore’s Law

Gordon E. Moore, who later co-founded Intel, observed that the number of transistors that could fit on a microchip doubled roughly every two years, even as the cost of computing halved.

This observation - that growth is exponential - suggested that the capability of computers would keep increasing every few years while they became cheaper to make.
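Moore’s observation can be sketched as a simple projection. The sketch below is illustrative only: the starting count of 64 transistors in 1965 and the function name are assumptions for demonstration, not figures from the article.

```python
def moores_law(start_year, start_count, target_year, doubling_period=2):
    """Project a transistor count forward, assuming it doubles
    every `doubling_period` years (Moore's observation)."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings


# Illustrative projection from an assumed 1965 baseline of 64 transistors
for year in (1965, 1975, 1985, 1995):
    print(year, round(moores_law(1965, 64, year)))
```

Running the sketch shows why the growth is called exponential: the count multiplies by roughly a thousand every 20 years rather than increasing by a fixed amount.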


1971 - Mass production of microchips lowers the cost

Moore was proved correct. Driven by the American government’s interest in the potential of microchips, mass production began and the cost of a chip fell to US$1.25.

Fred Kaplan, author of 1959: The Year Everything Changed, explained: “It was the government that created the large demand that facilitated mass production of the microchip.”


1986 - The USA and Japan sign the Semiconductor Agreement

One thing Moore’s law did not take into account was international interest and overseas competition, which soon caught up with American microchip manufacturing.

In 1986, the USA and Japan signed the Semiconductor Agreement, a government-supported arrangement that fixed prices for microchip manufacturing.


1998 - The first microchip is implanted in a person

Professor Kevin Warwick, cybernetics director at the University of Reading, became the first human to have a microchip implanted in their body. The procedure took 20 minutes and the experiment lasted one week.

Warwick reported that smart-card activated doors opened for him and lights blinked around him.


2021 - China catches up

Chinese engineers built their own transistor shortly after Texas Instruments did, but the Cultural Revolution derailed their work.

When China’s economy opened up in the 1980s, manufacturing firms were far behind the rest of the world. 

Yet, working around the various lockdowns, Chinese companies produced 29.9 billion chips in May 2021 alone.


2022 - Chip shortage bites

The Chinese government forced chipmaking hub and manufacturing megacity Shanghai into a lockdown to stop the spread of COVID-19. 

Chip manufacturers, including Semiconductor Manufacturing International Corp., have found it difficult to keep up with demand due to the restrictions and shortages have gripped companies such as Apple.

“The economic loss will be immense,” said Richard Yu, Executive Director at Huawei Technologies Co., in a WeChat post.


May 2022 - Russian soldiers loot chips from Ukrainian home appliances due to shortage

US Commerce Secretary Gina Raimondo said she had heard reports that sanctions on Russia have forced soldiers to loot computer chips from dishwashers in Ukrainian homes to power military equipment.

“We have reports from Ukrainians that when they find Russian military equipment on the ground, it’s filled with semiconductors that they took out of dishwashers and refrigerators,” said Raimondo.


The future of microchips

The future of microchips will likely involve embedding them in willing humans for communication, safety and medical care.
