A hypothesis published on 19 April 1965 still shapes computing today. We know it as Moore’s law.
Integrated circuits will lead to such wonders as home computers (or at least terminals connected to a central computer), automatic controls for automobiles, and personal portable communications equipment.
Integrated electronics will make electronic techniques more generally available throughout all of society, performing many functions that presently are done inadequately by other techniques or not done at all. The principal advantages will be lower costs and greatly simplified design…
If we look ahead five years, a plot of costs suggests that the minimum cost per component might be expected in circuits with about 1,000 components per circuit (providing such circuit functions can be produced in moderate quantities.) In 1970, the manufacturing cost per component can be expected to be only a tenth of the present cost.
The essay by Gordon Moore, head of research and development for Fairchild Semiconductor and future co-founder of Intel, ran in Electronics. He predicted that the complexity of integrated circuits, measured by the number of transistors on a chip, would double every 12 months for “at least” the next 10 years.
Gordon Moore’s 1965 forecast that the number of components on an integrated circuit would double every year until it reached an astonishing 65,000 by 1975 is the greatest technological prediction of the last half-century. When it proved correct in 1975, he revised what has become known as Moore’s Law to a doubling of transistors on a chip every two years (emphasis added).
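The arithmetic behind that forecast is simple compounding. A minimal sketch, assuming a starting point of roughly 64 components in 1965 (a round figure chosen here for illustration; it is not stated in the article):

```python
def components(year, base_year=1965, base_count=64, doubling_years=1):
    """Project component count under Moore's doubling rule.

    base_count components in base_year, doubling every doubling_years.
    The 1965 starting count of 64 is an assumed value for illustration.
    """
    doublings = (year - base_year) // doubling_years
    return base_count * 2 ** doublings

# Ten annual doublings from 1965 to 1975:
print(components(1975))  # 64 * 2**10 = 65536, Moore's "astonishing 65,000"
```

Ten doublings multiply the count by 2^10 = 1,024, which is how a modest 1965 chip extrapolates to roughly 65,000 components by 1975.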
In February 2020, Technology Review noted that “the most advanced chips today have nearly 50 billion transistors.”
That was before Covid-19 and the resultant supply chain challenges. Charles Leiserson, an MIT computer scientist, told Technology Review that the “rate of progress” had declined. Moore’s Law, he said, is “over. This year that became really clear.”
[T]he fabs that make the most advanced chips are becoming prohibitively pricey. The cost of a fab is rising at around 13% a year, and is expected to reach $16 billion or more by 2022. Not coincidentally, the number of companies with plans to make the next generation of chips has now shrunk to only three, down from eight in 2010 and 25 in 2002.
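The quoted 13% annual growth rate compounds quickly. A rough sketch of that curve, assuming a base cost of about $12.5 billion in 2020 (a value chosen here so the projection lands near the article's $16 billion figure for 2022; the starting cost is not given in the source):

```python
def fab_cost(year, base_year=2020, base_cost_billion=12.5, rate=0.13):
    """Project fab cost under ~13% annual growth (rate from the article).

    The 2020 base cost of $12.5B is an assumption for illustration.
    """
    return base_cost_billion * (1 + rate) ** (year - base_year)

print(round(fab_cost(2022), 1))  # ≈ 16.0 (billion dollars)
```

At 13% a year, costs double roughly every five and a half years, which helps explain why so few companies can still afford to build leading-edge fabs.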
Yet earlier this month, Wired pointed out that Apple’s M1 Ultra is a “Frankenstein’s monster” with 114 billion transistors.
As it becomes more difficult to shrink transistors in size, and impractical to make individual chips much bigger, chipmakers are beginning to stitch components together to boost processing power. The Lego-like approach is a key way the computer industry aims to progress. And Apple’s M1 Ultra shows that new techniques can produce big leaps in performance.
In 2020, Apple announced it would end its reliance on Intel chips for Macs and MacBooks and design its own chips instead.