Moore's Law is Dead

The observation that has predicted the future of computing for decades is on its last legs.

[Header image: © 2017 Wikimedia Commons - Richard Wheeler]

In 1965, Gordon E. Moore, one of the co-founders of Intel, noticed a trend: in every year since the invention of the silicon integrated circuit, the number of components that could cost-effectively be fitted onto a single chip had doubled. This trend, he believed, would continue into at least the medium-term future.

This doubling of the number of transistors in a given area of a silicon chip came to be known as Moore's Law, and in the five decades since it was first laid out it has proven exceedingly accurate. Time and time again, the performance of silicon chips has doubled: an exponential growth rate which has driven the explosion of ever-present digital devices in the modern world.
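To see what this doubling implies, here is a minimal sketch of the arithmetic. The starting chip, its transistor count, and the two-year doubling period are illustrative assumptions for this example, not figures taken from the article (Moore's original 1965 estimate was a doubling every year, later revised to roughly every two years):

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Projected transistor count in `year`, assuming the count doubles
    every `doubling_period` years starting from `start_count`."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Starting from Intel's 4004 (~2,300 transistors, 1971) as an illustrative
# reference point, a projection out to 2016 lands in the tens of billions,
# broadly the right order of magnitude for high-end chips of that era.
print(f"{transistors(2300, 1971, 2016):,.0f}")
```

The point of the sketch is how quickly exponential growth compounds: 45 years at a two-year doubling period is roughly 22 doublings, turning thousands of transistors into billions.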

But this trend is coming to an end.

The latest generations of silicon chips from companies such as Intel and ARM feature transistors on the nanometre scale. At these minuscule sizes, the transistor components contain so few atoms that the probability of a transistor being in the correct on or off state decreases. Put simply, there is a hard physical limit to the number of transistors that can be placed in a given area of a silicon chip.

To work around this, manufacturers are trying to stack chips on top of each other in three-dimensional structures to increase transistor density, and we are also seeing a rise in so-called multi-core chips, such as Intel's hexa-core i7 processors. The problem with these approaches, however, is that they are expensive.

Moore's Law is primarily an economic law, expressing how many transistors one can buy for a given amount of money. It is this dynamic that has enabled, for example, each new generation of smartphone to gain processing power without a corresponding increase in cost. With this in mind, even if manufacturers can increase the number of transistors in an area through 3D stacking, Moore's Law will not continue unless they can do so economically.

So what then will the future hold? Is the continued advancement of computing power no longer a reality?

Despite the current slowdown in chip performance gains, many avenues for advancing computing power remain. Designers are working on new chip architectures and manufacturing technologies that may enable a resumption of growth. Others are pursuing revolutionary new forms of computing that move beyond silicon altogether, such as quantum and spintronic computing, which would sidestep the physical limits imposed by silicon technology.

It would seem that, for the near term at least, your smartphone will continue getting smarter.

[Lower image: Wikimedia Commons - Julben]