One of the most historic predictions in computing is celebrating its 50th birthday and proving to its critics once again that Gordon E. Moore knew what he was talking about back in 1965.

That was when Moore, who went on to co-found chip maker Intel, penned a column in Electronics predicting that the number of transistors on a computer chip would double every year while the cost of a chip would remain steady.

His prediction, long known as Moore's Law, remains the standard driving both technology and the industry forward.

"Integrated circuits will lead to such wonders as home computers (or at least terminals connected to a central computer), automatic controls for automobiles, and personal portable communications equipment," Moore wrote.

At the time, Moore was 36 years old and working as the research and development chief at Fairchild Semiconductor; he believed his prediction would hold true for about a decade. He had no expectation it would still be holding fast as he reached his 86th birthday. When he wrote his now historic essay, the transistor was the size of a pencil eraser. Now, as one analogy notes, about 6 million of those transistors can fit in the period at the end of a sentence.

"In the beginning, it was just a way of chronicling the progress," Moore says in an Intel interview. "But gradually, it became something that the various industry participants recognized as something they had to stay on or fall behind technologically."

In financial terms, the market value of tech companies benefiting from Moore's Law hit $13 trillion last year.

But it's the technology advancement that best illustrates Moore's principle, notes Intel strategist Steve Brown.

"Ultimately, it won't be about making a better, faster smartphone," Brown said. "We may eventually discover how to make more food, create better living conditions and connect more people together. Moore's Law could be key to unlocking that."

As Moore recollects, his prediction came directly out of his R&D work at the time.

"I took the opportunity to look at what had happened up to that time. This would have been in 1964, I guess. And I looked at the few chips we had made and noticed we went from a single transistor on a chip to a chip with about eight elements (transistors and resistors) on it," said the chairman emeritus of Intel.

He recalls noticing that new chips featured twice the number of elements (at the time that was 16) and in the lab he and his team were building chips with about 30 elements and even contemplating 60 elements.

"Well, I plotted these on a piece of semi-log paper starting with the planar transistor in 1959 and noticed that, essentially, we were doubling every year," he recalls. "So I took a wild extrapolation and said we're going to continue doubling every year and go from about 60 elements at the time to 60,000 in 10 years."
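Moore's extrapolation is simple exponential doubling, and the arithmetic checks out. A minimal sketch, using the starting count of 60 elements from the quote above:

```python
# Moore's 1965 extrapolation: element counts double every year.
# Starting from roughly 60 elements per chip, project ten years out.
start_elements = 60
years = 10

projected = start_elements * 2 ** years
print(projected)  # 61440 -- close to the "60,000 in 10 years" Moore predicted
```

Ten doublings multiply the count by 2^10 = 1,024, which is why 60 elements lands almost exactly on Moore's round figure of 60,000.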

Even Moore seems bewildered looking back at what he was thinking at the time.

"I was just trying to communicate the point that this was the direction semiconductors were going," he said.

In 1975 Moore updated his prediction to a doubling every two years.

Who would have guessed that five decades later the chip industry would be boasting chips crammed with 1.9 billion transistors (the Intel Broadwell dual-core processor), or that Intel's high-end Xeon would be packed with 4.3 billion transistors?
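The revised two-year doubling rate lands in the right ballpark for those figures. As a rough sanity check (the 4004's transistor count of about 2,300 is a widely cited figure, not from this article):

```python
# Rough sanity check on the two-year doubling rate.
# Intel's first microprocessor, the 4004 (1971), had ~2,300 transistors.
start_transistors = 2_300
years_elapsed = 2015 - 1971          # 44 years
doublings = years_elapsed // 2       # doubling every two years -> 22 doublings

projected = start_transistors * 2 ** doublings
print(f"{projected:,}")  # 9,646,899,200 -- billions, the same order of
                         # magnitude as the 4.3-billion-transistor Xeon
```

The projection overshoots the actual Xeon count by roughly a factor of two, which is remarkably close for a 44-year exponential extrapolation.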

"The industry has been phenomenally creative in continuing to increase the complexity of chips. It's hard to believe - at least it's hard for me to believe - that now we talk in terms of billions of transistors on a chip rather than tens, hundreds, or thousands," said Moore, acknowledging that there is no end in sight for such chip capabilities.

"It's a technology that's been much more open ended than I would have thought in 1965 or 1975. And it's not obvious yet when it will come to the end."

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.