The law theorized (empirically) 55 years ago seems to be nearing its end. What are the possible alternatives? If today we can do almost everything with a simple app, running on a small, light smartphone we carry in our pockets, it is because Moore’s law has worked remarkably well for 55 years. It was back in 1965 that the American engineer Gordon Moore, who would go on to found Intel three years later together with Robert Noyce, predicted that the number of transistors integrated into a single microchip would double roughly every 18 months.
Fifty-five years later, we can confirm that Moore’s law has not only turned out to be spot on but even conservative, because for many years the number of transistors in a chip increased at even greater rates. For some time now, however, there has been talk of the end of Moore’s law, and the talk comes with great concern: it is increasingly difficult to pack an ever-larger number of transistors into a microprocessor. But what if Moore’s Law actually “stops working”?
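To make the doubling rule concrete, here is a minimal back-of-the-envelope sketch in Python (an illustration, not from the original article) that projects transistor counts forward from the Intel 4004’s roughly 2,300 transistors, assuming a strict 18-month doubling cadence:

```python
# Back-of-the-envelope projection of the popularized "doubling
# every 18 months" rule, starting from the Intel 4004 (1971).
# This is an idealized sketch; real transistor counts diverged
# from this cadence over the decades.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300        # approximate count in the Intel 4004
DOUBLING_PERIOD_YEARS = 1.5     # the popularized 18-month cadence

def projected_transistors(year: int) -> float:
    """Transistor count predicted by the doubling rule for a given year."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASE_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run forward to 2021, the strict 18-month rule actually overshoots real chips by orders of magnitude, which is one reason the law is usually quoted today with a two-year period.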
Why Moore Was Right
When Moore formulated his law in 1965, a strongly empirical observation that turned out to be almost a prophecy, he based it mainly on economic considerations. Moore, who at the time worked at Fairchild Semiconductor (which he had co-founded), knew that increasing the number of transistors in a chip was critical to its economic competitiveness: putting more transistors in a chip improved both its performance and its cost-effectiveness.
In 1965, then, the industry was still a long way from the physical limits that, several decades later, would make it challenging to keep building chips with more and more transistors inside them. For decades, therefore, Intel and the other companies producing microprocessors developed increasingly advanced production processes that allowed them to create smaller transistors and, above all, to position them ever closer to each other.
Intel’s first true mass-market microprocessor, the Intel 4004, was built with a 10-micrometer production process, that is, 10,000 nanometers, the unit of measurement used for modern chip production processes. In 2004 Intel moved the Pentium 4 to a 90 nm process, and in 2008 came the first Core i7 processors at 45 nm. Since 2019, it has produced Core i3, i5, and i7 processors with the Ice Lake architecture at 10 nm. 3 nm production processes are on the horizon and should debut towards 2023.
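The reason shrinking the process node matters so much can be sketched with simple arithmetic: if the feature size shrinks linearly, the number of transistors that fit in the same chip area grows roughly with the square of the shrink factor. The short Python sketch below (an idealized assumption, not from the article; modern node names no longer track physical dimensions exactly) illustrates the scale of the gain:

```python
# Idealized area-scaling sketch: density gain grows with the
# square of the linear shrink factor relative to the 4004.

NODES_NM = {
    "Intel 4004 (1971)": 10_000,   # 10 micrometers
    "Pentium 4 (2004)": 90,
    "Core i7 (2008)": 45,
    "Ice Lake (2019)": 10,
}

reference = NODES_NM["Intel 4004 (1971)"]
for name, node in NODES_NM.items():
    density_gain = (reference / node) ** 2   # idealized 1/feature^2 scaling
    print(f"{name}: {node} nm, ~{density_gain:,.0f}x the 4004's density")
```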
Why Moore’s Law Is About To End
Today, unlike in 1965, we are very close to the physical limits that threaten the hardware progress we have seen in these 55 years of Moore’s law. On the one hand, it is increasingly difficult and expensive to develop production processes capable of “printing” chips with more and more transistors inside them. On the other hand, the closer together the transistors sit, the greater the risk that electrons “jump” from the connections of one transistor to the next, producing calculation errors and, consequently, degraded system performance.
In 2016, for this reason, the Semiconductor Industry Association (SIA) began to officially hypothesize the need to move beyond Moore’s law, that is, to slow down the quantitative development of chips (more and more transistors) and instead think about the qualitative development of software (making better use of existing chips).
What Will Happen After Moore’s Law?
On a theoretical level, if Moore’s law stops working and microprocessors don’t keep growing in power at the speed seen to date, the world won’t collapse. There is ample evidence that much of today’s chip processing capacity is wasted through poor software programming. Many applications written for today’s computers, for example, are written in the Python programming language, invented in the early 1990s. The advantages of Python are many:
- It is pretty simple to use.
- It is cross-platform.
- It is available in different implementations for various operating systems.
But the performance of applications written in Python is significantly lower than that of equivalent software written, for example, in the C language. This means that today we waste much of our processors’ computing power processing instructions rather than data, and that if we woke up tomorrow in a world programmed entirely in C, we would hardly recognize it for how fast it would run. All of this, however, is easier said than done.
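One way to see the gap this paragraph describes, as a rough sketch rather than a rigorous benchmark: the same summation written as a pure-Python loop and as a call to the built-in sum(), whose inner loop runs in compiled C inside the CPython interpreter.

```python
# Rough illustration of interpreter overhead: the same arithmetic
# done in a pure-Python loop versus the C-backed built-in sum().
# Absolute timings vary by machine; the ratio is the point.

import timeit

N = 1_000_000

def python_loop() -> int:
    total = 0
    for i in range(N):   # each iteration is dispatched as Python bytecode
        total += i
    return total

def c_backed() -> int:
    return sum(range(N))   # the loop happens inside CPython's C code

loop_time = timeit.timeit(python_loop, number=10)
sum_time = timeit.timeit(c_backed, number=10)
print(f"pure-Python loop: {loop_time:.3f}s")
print(f"C-backed sum():   {sum_time:.3f}s (~{loop_time / sum_time:.0f}x faster)")
```

The speedup shown here comes from moving the loop out of the interpreter, which is exactly the kind of “qualitative” software improvement the SIA scenario points to.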
The Post-Moore Problems
It is likely that, at some point, new dedicated processors, chips designed specifically for tasks such as artificial intelligence, will prevail over GPUs (which, it is worth remembering, were born for video games and were later adapted to AI). All this will do nothing but concentrate the market for artificial intelligence algorithms in the hands of very few companies that both write the software and produce the hardware. Hardly an ideal situation, if we consider that the coming decades will be built on artificial intelligence algorithms and big data processing.