Pando

As Moore's Law turns 50, is there any way to save it from dying? Is it worth saving?

By David Holmes, written on April 21, 2015

From The News Desk

April 19, 1965: Before Facebook, before Apple, and even before Intel, there was Fairchild Semiconductor.

Launched in 1957 by the so-called "Traitorous Eight" -- who left Shockley Labs after the unhinged paranoia of its founder William Shockley became too much for his engineers to take -- Fairchild was the prototype for every Silicon Valley firm of the past five decades. Along with inventing the integrated circuit, which has arguably done more to shape modern living than any other invention of the past hundred years, the company's corporate structure -- or lack thereof -- would become the template for the modern startup. The founders dispensed with traditional hierarchies, allowing for an open working relationship, and in a move that was unheard of at the time, every employee was given stock options. Moreover, long before the alumni of PayPal, Netscape, Oracle, and Facebook formed networks of their own, the "Fairchildren" constituted the first Silicon Valley "Mafia," with its principals going on to found or fund a staggering number of seminal tech companies and venture firms including Intel, Intersil, and Kleiner Perkins, which in turn made key investments in Google, Amazon, Netscape, Twitter, Genentech, and Sun Microsystems.

One of these early "Fairchildren" was Gordon Moore, the company's head of R&D. On April 19, 1965, Moore contributed an article to the relatively obscure trade journal Electronics called "Cramming More Components onto Integrated Circuits." Nothing too monumental-sounding there. Of course, had Moore written the piece for the Internet with some fresh-faced millennial at Business Insider rewriting his headlines, the article would have been called, "Genius Electronics Pioneer Brilliantly Predicts the Course of Technological Progress Over the Next 50 Years." The only difference between that and a real BI headline is that it isn't much of an overstatement.

Moore theorized that the number of transistors that could fit on an integrated circuit atop a silicon chip would double every year, a rate he revised a decade later to every two years. His prediction, which was later immortalized as "Moore's Law," would turn out to be so accurate that for decades chipmakers relied on these projections when making long-term strategic decisions. Moreover, the theory posits an inverse relationship between cost and computational power. And as this trend of "more power for lower costs" played out -- not just in Moore's head but in reality -- it led to an exponential growth rate of technological progress and the ability to build massive companies extraordinarily quickly. In short, these dynamics laid the groundwork for the entire startup economy.
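The sheer force of that compounding is easy to underestimate. As a rough sketch (the starting count below is an illustrative round number, not a figure from Moore's paper), doubling every two years looks like this:

```python
# Illustrative sketch of Moore's Law compounding: doubling every two years.
# base_count is a hypothetical round number chosen for illustration only.
def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Transistor count predicted by doubling every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1965, 1975, 1995, 2015):
    print(y, f"{transistors(y):,.0f}")
```

Even from a toy starting point of 64 components, 50 years of biennial doubling multiplies the count by over 33 million -- which is why a small difference in the doubling period matters so much to chipmakers' planning.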

But while Moore's Law has had a nice run, exponentially higher growth rates at increasingly lower costs couldn't last forever. In truth, the law's demise has been on the horizon for years -- if it isn't dead already. In 2012, researchers at Purdue University and the University of New South Wales created a transistor built from a single atom. That's not only the smallest transistor ever built, but the smallest that theoretically can be built. (That is, until you start exploring the possibilities of quantum computing, but that's a rabbit hole for another discussion.)

Even Moore himself predicted this year that his law would die "in the next decade or so."

But what do people mean exactly when they say Moore's Law is dead or dying? And what consequences would its death have on the global economy, technological progress, and our quality of life?

To commemorate the 50th anniversary of Moore's Law over the weekend, I spoke to Alex Lidow, the CEO of Efficient Power Conversion (EPC), who has made it his life's work to prolong the lifespan of Moore's Law. How? As Intel and others have found, traditional chip technology, which relies on silicon, is approaching a ceiling -- pretty soon, somebody will make a silicon chip that is as cheap and powerful as that material allows.

"The numbers," Lidow tells me, "instead of doubling every two years are improving in the 10 percent to 30 percent range."

Therefore the potential for growth and innovation of any hardware industry that relies on these chips -- from laptops to mobile phones to connected home devices -- will inevitably slow, as will the flow of capital to many startups.
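The gap between those two trajectories compounds quickly. A hypothetical comparison of a decade of biennial doubling against a decade of 10 to 30 percent annual gains, in the range Lidow cites:

```python
# Hypothetical comparison over one decade: Moore's-Law doubling every
# two years versus the 10%-30% annual improvement Lidow describes.
def cumulative_gain(annual_rate, years=10):
    """Total improvement factor after `years` of compounding at annual_rate."""
    return (1 + annual_rate) ** years

moores_law = 2 ** (10 / 2)          # five doublings in ten years -> 32x
slow_low = cumulative_gain(0.10)    # roughly 2.6x
slow_high = cumulative_gain(0.30)   # roughly 13.8x
print(f"{moores_law:.0f}x vs {slow_low:.1f}x-{slow_high:.1f}x")
```

Even at the optimistic end of that range, a decade of 30 percent annual improvement delivers less than half the gain of classic Moore's Law pacing, and the shortfall widens every year after that.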

But Lidow says he's found a semiconducting material that is superior to silicon in many ways: gallium nitride (GaN). Both in laboratories and in practice, GaN chips have outperformed silicon in a number of use cases. They're also cheaper to manufacture, since they can be built on the same infrastructure used for silicon chips while being more resilient and requiring fewer protective elements. And while GaN technology is still years away from replacing the chips in our laptops and phones, Lidow says the material is already being used in next-generation technologies like virtual reality and self-driving cars.

But would the world really be worse off if Moore's Law stopped being true? How much more efficient must our smartphones become? Don't we waste enough time with these stupendously powerful machines? Moreover, do we really want our devices to become even better at tracking our every physical and virtual move?

"Let’s say for a moment that Moore’s Law slowed down or stopped -- our computational power stopped going down in cost and stopped going up in power," Lidow says. "I think that if everything froze we’d become an applications-oriented industry. We’d exploit more and more applications. More WhatsApps and Airbnbs and that would be the direction for the improvement of our lives.

"But here’s the things that wouldn’t be able to happen: We wouldn’t be able to communicate that information any better than we do today. So if you’re happy with your cell phone bandwidth, if you’re happy with the bandwidth that you have on your laptop, some of that can be improved, but they will be limited."

And here's the consequence that should give all entrepreneurs and venture capitalists pause: "We’d have fewer companies around innovating anything."

Lidow says one of the biggest areas where a freeze on hardware capabilities would be felt is in the Internet of Things and in ensuring that these devices can communicate with servers and each other at an acceptable level of efficiency. That may not sound too apocalyptic for consumers who can live without connected toasters gobbling up and parsing data related to how many English muffins they ate last week.

But limits on computational power could also mean limits on the infrastructure capabilities that support streaming video data, Lidow says. And as more and more content consumption has migrated from television airwaves to the Internet, the allocation of streaming resources has already become a huge issue for consumers, corporations, content creators, and politicians. Comcast, for example, was caught intentionally slowing down Netflix in order to extort higher fees from the streaming site -- fees which will likely be passed on to customers over time. With the fight over streaming regulations and net neutrality stretching from boardrooms to the halls of Congress, anything that limits innovation in this space will make what's already a complicated mess of a situation even worse.

So is gallium nitride the answer? Is it as simple as trading out all the silicon on which transistors currently sit for this cheaper, more efficient material?

We're not there yet. Lidow's team has yet to replicate or surpass the efficiency of silicon for digital applications like laptops and phones. And it may be years before this breakthrough is achieved even in a laboratory.

"That requires some innovation," Lidow says. "We’ll probably get there. I think we’re about two years away from really nailing it in the lab. And then probably two or three years from that until we see [GaN] in widespread products."

But while digital technologies make up the bulk of the $300 billion semiconductor industry, there are many other semiconducting applications for which EPC's chips can be, and already are, used. Google, for example, already utilizes the company's GaN chips to instantly measure distances between its self-driving cars and other automobiles and objects on the road to avoid collisions.

"In order to do that extremely accurately you need something extraordinarily fast," Lidow says. "If you look at Google Mapping cars, they have LIDAR (Light Detection and Ranging) systems using our devices."

EPC's GaN chips are also used for wireless power transmission, satellite systems, docking mechanisms on space stations, and even virtual reality.

"Early generations of VR had a flaw in that you got seasick if you stood up and walked around in them," Lidow says. "We use LIDAR to scan a real-time 3D image of where you are and interpret your surroundings so the system can integrate them into the experience. A remote surgeon can see in a three-dimensional way, there are soldier systems, new types of gaming... It’s not possible or it’s just not practical with silicon."

When defined strictly as the number of transistors that can fit on a silicon plate, Moore's Law is indeed dying. But when it comes to the spirit of the law, which simply means greater computing capabilities at lower costs, innovations like GaN are looking to keep Moore's Law alive. Of course, with boundless computing power come very serious concerns that technology is outpacing regulatory or social frameworks, or that some fields that stand to benefit from this progress -- particularly artificial intelligence -- pose a threat to humanity not to be taken lightly.

But Lidow is confident that the benefits will outweigh the costs.

"I’m certainly not a social philosopher and there are a lot of social and ethical questions that come up when you have ubiquitous information floating in the air. That leads to all sorts of possibilities, like machines that make moral decisions to hit the child or hit the car in front of you. Over all of that though is, the more you have the ability to process information, your chance of making the right decision is better. And the more you have low cost information, the more equalized the playing field is for all walks of life. So I think that the moral benefits are certainly huge and the potential pitfalls are huge, but they are manageable."

[illustration by Brad Jonas]