Moore’s Law is the prediction that the number of transistors in a dense integrated circuit doubles roughly every two years as technological progress advances. The observation was made by Gordon Moore, co-founder of Intel, who noted that transistor sizes were shrinking rapidly thanks to continuous innovation.
History of Moore’s Law
Gordon Moore, then director of research and development at Fairchild Semiconductor and later co-founder of Intel, predicted in 1965 that the number of transistors on a dense integrated circuit (IC) would double every year for the next decade, based on the economics of integrated circuits. Moore wrote: “The cost per component is nearly inversely proportional to the number of components.” In other words, the more transistors on a chip, the lower the cost per transistor.
Gordon Moore’s prediction held for 10 years, and the observation later became known as Moore’s Law. In 1975, he revised his forecast, stating that the number of transistors would double every two years going forward.
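The two-year doubling can be sketched as a simple exponential model. This is a minimal illustration only: the baseline figures (the Intel 4004 of 1971, with roughly 2,300 transistors) are approximate, and real chips track this curve only loosely.

```python
def transistor_count(year, base_year=1971, base_count=2300, doubling_years=2):
    """Idealized Moore's Law projection: N(t) = N0 * 2 ** ((t - t0) / T).

    Baseline is the Intel 4004 (1971, ~2,300 transistors); T = 2 years
    reflects Moore's 1975 revision.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Doubling every two years implies ~32x growth per decade (2 ** 5).
print(round(transistor_count(1981)))  # 32x the 1971 baseline
```

A useful rule of thumb that falls out of the model: a two-year doubling period compounds to roughly a thousandfold increase every twenty years (2 ** 10 = 1024).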
Moore’s Law held up well over the following decades, producing reasonably accurate predictions of how many transistors could fit on a single integrated circuit. However, many computer scientists, including Moore himself, have predicted that the law is coming to an end.
Over the last few years, growth in the number of transistors per IC has slowed, falling well below what Moore’s Law predicts.
Moore’s initial projection was for the decade 1965-1975, as shown in the graph below:
Following Moore’s revised prediction, the graph below shows how transistor counts actually evolved over time:
As the graph shows, transistor counts doubled approximately every two years from 1965 to 2014, with growth slowing from 2015 onward. Some of the biggest chip manufacturers contributing to this growth are Intel, Samsung, Qualcomm, and AMD.
Applications of Moore’s Law
Over the last five decades, the law has been widely used in the semiconductor industry and has made significant economic, technological, and social contributions.
It’s been used by the semiconductor industry to set R&D targets and plan production for the long term.
It’s been linked to the growth of digital electronics and the advent of smaller, more powerful electronic devices.
It’s made computing relatively inexpensive and affordable, providing societal, economic, and technological benefits.
It’s kept the semiconductor industry in sync with developments and technological progress, with almost every company applying the law.
Moore’s Second Law
While Moore’s Law implies that the cost of computing (for consumers) falls as more components are packed into an integrated circuit, Moore’s Second Law states that the capital cost of manufacturing integrated circuits rises exponentially over time. In other words, the costs of R&D, manufacturing, and testing rise exponentially with each new generation of chips.
Moore’s Second Law plays an important role in the sustainability of Moore’s Law. As the costs of innovation and manufacturing rise, companies are likely to slow their pace of technological advancement, and transistor counts are likely to fall short of what Moore’s Law predicts.
The Future of Moore’s Law
As the costs of R&D increase and companies slow down their rate of innovation, Moore’s Law seems to be coming to an end. In 2015, Gordon Moore himself stated, “I see Moore’s Law dying here in the next decade or so.”
Companies in the industry are now gradually shifting their focus toward alternatives to conventional silicon scaling, such as quantum computing, AI-focused chips, and application-specific integrated circuits (ASICs). These are new developments in the computing industry and may yield large efficiency gains in the near future.