SAN FRANCISCO — On April 19, 1965, just over 50 years ago, Gordon Moore, then the head of research for Fairchild Semiconductor and later one of the co-founders of Intel, was asked by Electronics magazine to submit an article predicting what would happen to integrated circuits, the heart of computing, over the next 10 years. Studying the trend he’d seen in the previous few years, Moore predicted that every year we’d double the number of transistors that could fit on a single chip of silicon, so you’d get twice as much computing power for only slightly more money. When that came true, in 1975, he modified his prediction to a doubling roughly every two years. “Moore’s Law” has essentially held up ever since — and, despite the skeptics, it keeps chugging along, making it probably the most remarkable example ever of sustained exponential growth of a technology.
For the 50th anniversary of Moore’s Law, I interviewed Moore, now 86, at the Exploratorium in San Francisco, at a celebration in his honor co-hosted by the Gordon and Betty Moore Foundation and Intel. I asked him what he’d learned most from Moore’s Law having lasted this long.
“I guess one thing I’ve learned is once you’ve made a successful prediction, avoid making another one,” Moore said. “I’ve avoided opportunities to predict the next 10 or 50 years.”
But was he surprised that it has proved basically correct for so long?
“Oh, I’m amazed,” he said. “The original prediction was to look at 10 years, which I thought was a stretch. This was going from about 60 elements on an integrated circuit to 60,000 — a thousandfold extrapolation over 10 years. I thought that was pretty wild. The fact that something similar is going on for 50 years is truly amazing. You know, there were all kinds of barriers we could always see that [were] going to prevent taking the next step, and somehow or other, as we got closer, the engineers had figured out ways around these. But someday it has to stop. No exponential like this goes on forever.”
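Moore’s thousandfold figure is just the arithmetic of ten consecutive doublings. As a quick back-of-the-envelope check (my gloss, not a calculation from the original article):

$$60 \times 2^{10} = 60 \times 1{,}024 = 61{,}440 \approx 60{,}000$$

Doubling every year for a decade multiplies the element count by about a thousand, which is the leap from 60 to 60,000 that he called “pretty wild.”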
But what an exponential it’s been. In introducing the evening, Intel’s C.E.O., Brian Krzanich, summarized where Moore’s Law has taken us. If you compare Intel’s first-generation microchip, the 1971 4004, with the latest chip Intel has on the market today, the fifth-generation Core i5 processor, he said, you can see the power of Moore’s Law at work: Intel’s latest chip offers 3,500 times more performance, is 90,000 times more energy efficient and costs about 60,000 times less.
To put that another way, Krzanich said Intel engineers did a rough calculation of what would happen had a 1971 Volkswagen Beetle improved at the same rate as microchips did under Moore’s Law: “Here are the numbers: [Today] you would be able to go with that car 300,000 miles per hour. You would get two million miles per gallon of gas, and all that for the mere cost of 4 cents! Now, you’d still be stuck on the [Highway] 101 getting here tonight, but, boy, in every opening you’d be going 300,000 miles an hour!”
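Krzanich’s figures are easy to sanity-check. Here is a minimal sketch in Python of the arithmetic, assuming rough 1971 Beetle baselines of about an 80 m.p.h. top speed, 25 m.p.g. and a $2,500 sticker price (those baselines are my assumptions for illustration, not numbers from his talk):

```python
# Back-of-the-envelope check of Krzanich's Beetle analogy.
# The 1971 baselines below are rough assumptions for illustration,
# not figures from the talk.

top_speed_mph = 80        # assumed top speed of a 1971 Beetle
fuel_economy_mpg = 25     # assumed fuel economy
price_usd = 2_500         # assumed sticker price

# Improvement factors Krzanich cited for the 4004 -> Core i5 comparison.
performance_gain = 3_500
efficiency_gain = 90_000
cost_reduction = 60_000

print(f"Top speed:    {top_speed_mph * performance_gain:,} m.p.h.")
print(f"Fuel economy: {fuel_economy_mpg * efficiency_gain:,} m.p.g.")
print(f"Price:        ${price_usd / cost_reduction:.2f}")
```

Multiplying the assumed baselines by the chip’s improvement factors gives roughly 280,000 m.p.h., 2.25 million m.p.g. and a 4-cent price tag, in the same ballpark as the figures Krzanich quoted.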
What is most striking in Moore’s 1965 article is how many predictions he got right about what these steadily improving microchips would enable. The article, entitled “Cramming More Components Onto Integrated Circuits,” argued that: “Integrated circuits will lead to such wonders as home computers — or at least terminals connected to a central computer — automatic controls for automobiles, and personal portable communications equipment. The electronic wristwatch needs only a display to be feasible today. ... In telephone communications, integrated circuits in digital filters will separate channels on multiplex equipment. [They] will also switch telephone circuits and perform data processing.”
Moore pretty much anticipated the personal computer, the cellphone, self-driving cars, the iPad, Big Data and the Apple Watch. How did he do that? (The only thing he missed, I jokingly told him, was “microwave popcorn.”)
“Well,” said Moore, “I had been looking at integrated circuits — [they] were really new at that time, only a few years old — and they were very expensive. There was a lot of argument as to why they would never be cheap, and I was beginning to see, from my position as head of a laboratory, that the technology was going to go in the direction where we would get more and more stuff on a chip and it would make electronics less expensive. ... I had no idea it was going to turn out to be a relatively precise prediction, but I knew the general trend was in that direction and had to give some kind of a reason why it was important to lower the cost of electronics.”
Can it continue? Every year someone predicts the demise of Moore’s Law, and they’re wrong. With enough good engineers working on it, he hoped, “we won’t hit a dead end. ... It’s [a] unique technology. I can’t see anything really comparable that has gone on for this long a period of time with exponential growth.”
But let’s remember that it was enabled by a group of remarkable scientists and engineers, in an America that did not just brag about being exceptional, but invested in the infrastructure and basic scientific research, and set the audacious goals, to make it so. If we want to create more Moore’s Law-like technologies, we need to invest in the building blocks that produced that America.
Alas, today our government is not investing in basic research the way it did when the likes of Moore and Robert Noyce, the co-inventor of the integrated circuit and the other co-founder of Intel, were coming of age.
“I’m disappointed that the federal government seems to be decreasing its support of basic research,” said Moore. “That’s really where these ideas get started. They take a long time to germinate, but eventually they lead to some marvelous advances. Certainly, our whole industry came out of some of the early understanding of the quantum mechanics of some of the materials. I look at what’s happening in the biological area, which is the result of looking more detailed at the way life works, looking at the structure of the genes and one thing and another. These are all practical applications that are coming out of some very fundamental research, and our position in the world of fundamental science has deteriorated pretty badly. There are several other countries that are spending a significantly higher percentage of their G.N.P. than we are on basic science or on science, and ours is becoming less and less basic.”
How, I asked, did he first get interested in science?
“My neighbor got a chemistry set and we could make explosives,” he said. “In those days, chemistry sets had some really neat things in them, and I decided about then I wanted to be a chemist not knowing quite what they did, and I continued my work in a home laboratory for some period of time. Got to the point where I was turning out nitroglycerin in small production quantities and turning it into dynamite. ... A couple ounces of dynamite makes a marvelous firecracker. That really got my early interest in it. You couldn’t duplicate that today, but there are other opportunities. You know, I look at what some of my grandkids are doing, for example, those robotics and the like. These are spectacular. They’re really making a lot of progress.”
Looking back on Moore’s Law and the power of computing that it has driven, I asked Moore what he thought was its most important contribution over the past 50 years.
“Wow!” he said. “You know, just the proliferation of computing power. We’ve just seen the beginning of what computers are going to do for us.”
How so?
“Oh, I think incrementally we see them taking over opportunities that we tried to do without them before and were not successful,” he added. “It’s kind of the evolution into the machine intelligence, if you wish, and this is not happening in one step. To me, it’s happening in a whole bunch of increments. I never thought I’d see autonomous automobiles driving on the freeways. It wasn’t many years ago [they] put out a request to see who could build a car that could go across the Mojave Desert to Las Vegas from a place in Southern California, and several engineering teams across the country set out to do this. Nobody got more than about 300 yards before there was a problem. Two years later, they made the full 25-mile trip across this desert track, which I thought was a huge achievement, and from that it was just a blink before they were driving on the freeways. I think we’re going to see incremental advances like that in a variety of other areas.”
Did he worry, I asked Moore, whose own microprocessors seemed as sharp as ever, that machines would really start to replace both white-collar and blue-collar labor at a scale that could mean the end of work for a lot of people?
“Don’t blame me!” he exclaimed. “I think it’s likely we’re going to continue to see that. You know, for several years, I have said we’re a two-class society separated by education. I think we’re seeing the proof of some of that now.”
When was the moment he came home and said to his wife, Betty, “Honey, they’ve named a law after me”?
Answered Moore: “For the first 20 years, I couldn’t utter the term Moore’s Law. It was embarrassing. It wasn’t a law. Finally, I got accustomed to it where now I could say it with a straight face.”
Given that, was there something he wished he had predicted — like Moore’s Law — but did not? I asked.
“The importance of the Internet surprised me,” said Moore. “It looked like it was going to be just another minor communications network that solved certain problems. I didn’t realize it was going to open up a whole universe of new opportunities, and it certainly has. I wish I had predicted that.”