Sometime in the next several years, one of the underpinnings of the digital age will run out of steam: Moore’s Law. Today, computing giant IBM said it will invest $3 billion over the next five years to research what will happen when it does.

It’s one of those uncomfortable facts that, like global warming, you hope science will solve before it actually interferes with your life. But what it comes down to is this: For nearly a half-century, the transistors and other elements on silicon chips have been shrinking at a more or less predictable rate, meaning ever more of them fit on a chip of the same size. That in turn has made chips ever more powerful and efficient computing engines, and thus the primary building blocks of our daily digital habits.
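As a rough back-of-the-envelope sketch (the two-year doubling cadence and the starting figures below are illustrative historical averages, not exact values), that predictable rate can be expressed as:

```python
# Illustrative sketch of Moore's Law: transistor counts doubling
# roughly every two years from a known starting point.

def transistors(start_count, start_year, year, doubling_period=2.0):
    """Estimate transistor count in a given year, assuming a
    fixed doubling cadence (a simplification of the real trend)."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Intel's 4004 from 1971 had roughly 2,300 transistors; projecting
# that cadence forward lands in the billions by the 2010s.
print(f"{transistors(2_300, 1971, 2014):,.0f}")
```

The point of the exercise is only to show how relentless exponential doubling is; the actual cadence has varied between roughly 18 and 24 months over the decades.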

But like everything else, the materials used to build chips are governed by the laws of physics. And the basic fact is that chip features can shrink only a little further before they stop working right. The current generation of chips, due later this year, has reached 14 nanometers. For a sense of scale, that’s only a tad thicker than the membrane of an individual cell. The technology roadmap goes to about seven nanometers, roughly the diameter of a hemoglobin molecule.

Bernie Meyerson, an IBM fellow who also carries the title of chief innovation officer, put it another way: “Silicon has a physical limit at which things get so small it goes quantum mechanical,” he said. “That’s a polite way of saying it renders itself useless.”

In recent years, chip companies like IBM, Intel and others have, he says, essentially been playing tricks to keep the electrons on chips flowing in the desired way. But the bag of understood tricks runs out beyond the seven-nanometer generation. After that, Meyerson says, progress will require some new material or some new ways of building chips. The $3 billion investment will be put toward exploring that territory.

There are a few avenues to explore. There’s quantum computing, an exotic branch of computing technology that moves beyond basic bits of ones and zeroes to qubits, which can maintain more than one state at once — essentially one and zero at the same time. Quantum computers can solve certain kinds of computing problems much faster than traditional computers.
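A toy way to see the "one and zero at the same time" idea (this is a textbook single-qubit illustration, not anything specific to IBM's research): a qubit is described by two amplitudes, and the squared magnitudes give the odds of measuring a zero or a one.

```python
import math

# Toy single-qubit state: amplitudes for the |0> and |1> states.
# An equal superposition gives even odds of measuring either.
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes,
# and they always sum to 1.
p0 = amp0 ** 2
p1 = amp1 ** 2
print(p0, p1)
```

Until it is measured, the qubit genuinely carries both components at once, which is what lets quantum algorithms explore many possibilities in parallel.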

There’s neurosynaptic computing, where chips are designed to operate more like human brains. Big Blue has done some work in this field and has a long-term goal to build a system with 10 billion neurons and a hundred trillion synapses that uses only a kilowatt of power while fitting roughly into the space of a half-gallon carton of milk.

There’s also silicon photonics, a field where the electrons on a chip would be essentially supplanted by individual particles of light. Light is already a proven method for moving information — consider all the fiber optic cables carrying Internet and voice traffic. The thinking goes that it could be just as useful inside a computer as between computers.

The good news is that we’ve been here before. In the 1980s, the then-current technology for building chips known as bipolar junction transistors had reached the end of its life. IBM was one of the companies that made a big bet on CMOS technology (pronounced SEA-moss, it stands for complementary metal oxide semiconductor), which is the basic technology behind today’s chips. As Meyerson put it: “We bet the farm on CMOS, and initially it wasn’t as good as what it was meant to replace, but we knew we could make it better. The companies who stuck with the old technology don’t exist anymore.”

That’s what IBM is seeking to avoid. The point of the new research is to investigate all the different avenues for new technologies, rule out the ones that won’t work and press ahead on the ones that show more promise. “The trick is to fail fast,” Meyerson said.

Although IBM is doing research on the next generation of chips, it will not actually build them. News of the investment comes against the backdrop of persistent rumors that IBM wants to sell its chip-making business unit, IBM Microelectronics. In April, The Wall Street Journal reported that IBM was close to a deal to sell the business to GlobalFoundries, the privately held chip manufacturer.



1 comment

TinkerTom

Interesting read, but I'd guess that what we'll see is an initial jump when we move to one of the 3 technologies mentioned, then back to Moore's Law again, just as with CMOS. 


If you subscribe to the notion of the Singularity, or even just acknowledge the accelerating pace of technological development, you'd imagine this problem will be easier to solve than the CMOS transition of the '80s, simply because of the growth in domain knowledge and in the power of the tools we can bring to bear on it.


Should be interesting to see what happens in the next 5 years, that's for sure.
