How The Integrated Circuit Came To Be

For something so small, integrated circuits have done so much to revolutionize human civilization. These sophisticated electric circuits are what make modern electronics and computing technologies work, so it isn’t surprising to see them integrated into a wide range of electrical and electronic devices—from television sets and computers to cars and airplanes.

An integrated circuit is designed to be a single component which comprises an array of electric circuit components, including transistors, resistors, capacitors, and diodes. All these are built on the surface of a single chip or wafer of a semiconducting material, the most common of which is silicon.

Today, integrated circuits are manufactured on an industrial scale. The production process takes advantage of a range of physical and chemical techniques performed on a silicon wafer substrate, including photolithography, etching, thin-film deposition, and doping.

Central to these procedures is a technique called lithography, the process of building up intricate circuit patterns on the substrate, layer by layer. Light is used to transfer each pattern from a mask onto a light-sensitive photoresist coating the substrate. Since integrated circuits are very small and may contain millions or even billions of transistors, instruments like microscopes equipped with very precise linear stages and motion control systems are used in the manufacturing and quality control of these products.
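
To get an intuition for the mask idea, here is a deliberately simplified Python sketch (a toy illustration only; real lithography involves optics, chemistry, and nanometer-scale tolerances). It models a mask as a grid of opaque (0) and transparent (1) cells and a positive photoresist, where exposed regions dissolve away during development:

```python
# Toy model of photolithographic pattern transfer.
# A mask cell is 0 (blocks light) or 1 (lets light through).
mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]

# With a positive photoresist, exposed regions dissolve during development,
# so resist remains only where the mask blocked the light.
resist_after_develop = [[1 - cell for cell in row] for row in mask]

# Print the remaining resist pattern ('#' = resist left, '.' = cleared).
for row in resist_after_develop:
    print("".join("#" if cell else "." for cell in row))
```

The remaining resist then protects the underlying material during the next etching or doping step, which is how a two-dimensional exposure ends up shaping a three-dimensional structure.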

The Beginnings of the Integrated Circuit

Among the different components of an electric circuit, the most important is the transistor, because it acts like a switch that can be in either an “on” or an “off” state. This may seem very simple, but the ability to switch between two states is the basis of binary logic, the language computers use to encode data and process enormous amounts of information.
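
To see why a simple on/off switch is enough to build computation, consider this minimal Python sketch (an analogy, not how chips are actually built): it treats a transistor as a boolean switch, models a NAND gate, derives other gates from it, and combines them into a circuit that adds two bits.

```python
# A transistor acts as a switch: current flows (True) or it doesn't (False).
# We model a NAND gate directly, since every other gate can be built from it.

def nand(a: bool, b: bool) -> bool:
    """Output is off only when both inputs are on."""
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def xor(a: bool, b: bool) -> bool:
    """Classic four-NAND construction of exclusive-or."""
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Adds two bits, returning (sum, carry): the first step toward arithmetic."""
    return xor(a, b), and_(a, b)

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```

Chain enough of these gates together and you can add, compare, and store arbitrary numbers, which is exactly what the millions of transistors on a chip do.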

The invention of the transistor at Bell Labs in 1947 was a huge advance in electronics. Before transistors became available, engineers had to rely on vacuum tubes, which produced a lot of heat, were less durable, and were large and bulky. As a case in point, ENIAC, the first electronic general-purpose computer, weighed 27 tons and covered about 167 square meters of floor space. An entire room was needed just to house this computing device, which used almost 17,500 vacuum tubes.

However, creating smaller transistors wasn’t enough. Machines with very complex circuitry, like computers, still required a huge number of wires and other electrical components to connect everything together, and it was practically impossible to hand-wire such an intricate system on a small scale without something failing.

The Breakthrough

A breakthrough came in September 1958, when a Texas Instruments employee named Jack Kilby successfully demonstrated the idea of integrating all components of an electric circuit on “a body of semiconductor material”; in his case, the material was germanium. Just a few months later, future Intel co-founder Robert Noyce introduced an even better integrated circuit, which, unlike Kilby’s, was built on a silicon chip.

Initially, integrated circuits were used in devices and equipment manufactured for the United States Armed Forces and for the National Aeronautics and Space Administration. Eventually, however, they found their way into ordinary commercial products, including calculators, personal computers, and, decades later, smartphones and other handheld devices. Today, these compact but complex electric circuits are at the very core of what makes our modern, digital world run like clockwork.
