

Can a silicon chip act like a human brain? Researchers at IBM say they’ve built one that mimics the brain better than any that has come before it.

In a paper published today in the journal Science, IBM said it used conventional silicon manufacturing techniques to create what it calls a neurosynaptic processor. The chip could rival a traditional supercomputer on highly complex computations while consuming no more power than a typical hearing aid battery supplies.

The chip is also one of the biggest ever built, boasting some 5.4 billion transistors, which is about a billion more than the number of transistors on an Intel Xeon chip.

To do this, researchers designed the chip with a mesh network of 4,096 neurosynaptic cores. Each core contains elements that handle computing, memory and communicating with other parts of the chip. Each core operates in parallel with the others.
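The spiking behavior of such cores can be illustrated with a toy integrate-and-fire neuron model. This is a minimal sketch, not IBM's actual core design; the weights, threshold, and leak value below are invented for illustration:

```python
# Toy integrate-and-fire neurons, loosely inspired by the article's
# description of spiking neurosynaptic cores. This is NOT IBM's
# TrueNorth design; the weights, threshold, and leak are invented.

def step(potentials, weights, spikes_in, threshold=1.0, leak=0.1):
    """Advance every neuron one tick; return the indices that fire."""
    fired = []
    for i in range(len(potentials)):
        # Integrate weighted input spikes, then apply a constant leak.
        potentials[i] += sum(w * s for w, s in zip(weights[i], spikes_in))
        potentials[i] = max(0.0, potentials[i] - leak)
        if potentials[i] >= threshold:  # fire and reset
            fired.append(i)
            potentials[i] = 0.0
    return fired

# Two neurons listening to three input lines.
potentials = [0.0, 0.0]
weights = [[0.6, 0.6, 0.0],  # neuron 0 listens to inputs 0 and 1
           [0.0, 0.0, 0.4]]  # neuron 1 listens to input 2
print(step(potentials, weights, [1, 1, 1]))  # [0]: only neuron 0 fires
```

In the real chip each core handles its own neurons in parallel and communicates spikes to other cores over the mesh, rather than looping over them as this sketch does.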

Multiple chips can be connected seamlessly, IBM says, to create a neurosynaptic supercomputer. The company went so far as to build one using 16 of the chips.

The new design could shake up the conventional approach to computing, which has remained more or less unchanged since the 1940s and is known as the Von Neumann architecture. In plain English, a Von Neumann computer (you're using one right now) stores a program's instructions and data in the same memory and feeds them, one at a time, to a central processor.
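The stored-program idea can be sketched as a toy fetch-execute loop. The opcodes and memory layout here are invented for illustration; real processors are vastly more complex:

```python
# Toy Von Neumann machine: program and data live in one shared memory,
# and the processor fetches instructions from it one at a time.
# The opcodes and layout here are invented for illustration.

def run(memory):
    pc, acc = 0, 0  # program counter, accumulator
    while True:
        op, arg = memory[pc], memory[pc + 1]  # fetch from shared memory
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            return acc
        pc += 2  # step sequentially to the next instruction

program = ["LOAD", 7, "ADD", 5, "HALT", 0]
print(run(program))  # 12
```

The sequential fetch through a single memory is the bottleneck a neurosynaptic design sidesteps by putting computation, memory, and communication together in every core.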

This chip, which has been dubbed TrueNorth, relies on its network of neurons to detect and recognize patterns in much the same way the human brain does. If you’ve read your Ray Kurzweil, this is one way to understand how the brain works — recognizing patterns. Put simply, once your brain knows the patterns associated with different parts of letters, it can string them together in order to recognize words and sentences. If Kurzweil is correct, you’re doing this right now, using some 300 million pattern-recognizing circuits in your brain’s neocortex.
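Hierarchical pattern recognition of this kind can be sketched as a two-level lookup: stroke patterns are recognized as letters, and letter sequences as words. Every pattern below is invented for illustration:

```python
# Toy two-level pattern recognizer in the spirit of the Kurzweil
# description: stroke patterns are recognized as letters, and letter
# sequences as words. Every pattern here is invented for illustration.

letter_patterns = {
    ("|", "-", "|"): "H",  # two verticals joined by a crossbar
    ("|",): "I",           # a single vertical stroke
}
word_patterns = {("H", "I"): "HI"}

def recognize(strokes_per_letter):
    """Map strokes -> letters, then the letter sequence -> a word."""
    letters = tuple(letter_patterns.get(s, "?") for s in strokes_per_letter)
    return word_patterns.get(letters, "?")

print(recognize([("|", "-", "|"), ("|",)]))  # HI
```

The brain's recognizers are learned rather than hand-coded, of course; the point is only that recognition composes upward from small patterns to larger ones.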

The chip would seem to represent a breakthrough on one of the long-standing problems in computing: computers are really good at doing math and reading words, but tasks that are easy for humans, such as discerning meaning and context or recognizing and classifying objects, have been difficult for traditional computers. One way IBM tested the chip was to see if it could detect people, cars, trucks and buses in video footage and correctly recognize them. It worked.

In terms of complexity, the TrueNorth chip has a million neurons, about the same number as the brain of a common honeybee. A typical human brain has around 100 billion. But given time, the technology could be used to build computers that can not only see and hear, but understand what is going on around them.

Currently, the chip delivers 46 billion synaptic operations per second, or SOPS, per watt. Comparing that with a traditional supercomputer, whose performance is measured in floating point operations per second, or FLOPS, is an apples-to-oranges exercise, but the most energy-efficient supercomputer now running tops out at about 4.5 billion FLOPS per watt.
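The per-watt arithmetic behind that comparison can be laid out explicitly. Note that the roughly 70-milliwatt power figure below is an assumption drawn from IBM's public statements about TrueNorth, not from this article, and the units on the two sides genuinely differ:

```python
# Loose per-watt comparison. The ~70 mW power figure is an assumption
# drawn from IBM's public TrueNorth statements, not from the article.
sops_per_watt = 46e9       # synaptic operations per second, per watt
chip_power_watts = 0.070   # ~70 mW, roughly a hearing-aid-battery load
total_sops = sops_per_watt * chip_power_watts
print(f"{total_sops:.2e} synaptic ops/sec")  # ~3.22e9

# The most energy-efficient supercomputer of the day ran at roughly
# 4.5 billion FLOPS per watt; a synaptic operation is not a FLOP,
# so treat the ratio as illustrative only.
flops_per_watt = 4.5e9
ratio = sops_per_watt / flops_per_watt
print(f"~{round(ratio)}x more operations per watt (apples to oranges)")
```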

Down the road, the researchers say in their paper, they foresee TrueNorth-like chips being combined with traditional systems, each solving the problems it is best suited to handle. It also means that systems rivaling some of the capabilities of current supercomputers could fit into a machine the size of your smartphone, while consuming even less energy.

The project was funded with money from DARPA, the Department of Defense’s research organization. IBM collaborated with researchers at Cornell Tech and iniLabs.


Horrible grammar in my last comment, but you all know what I'm saying.


They're forgetting that the brain also stores data for a program in memory; many programs, in the case of the human brain. How do you think it processes the patterns? Everything needs memory, even our brains. The data has to be stored somewhere! Other than that, a great article.


@DJMullen That's not what they're saying... A traditional processor will process 1 + 1 every time it sees it by adding 1 to 1. Our brains, and apparently this processor, don't work like that. Our brains function more like a boolean flow chart: first gate no, second gate yes, OK, the answer is two. It recognizes the pattern rather than performing the operation, and it recognizes the pattern at the processor level, meaning it doesn't require a cache to get there. Think about when you were learning multiplication: did they teach you to keep adding tick marks, or did they try to get you to memorize a chart? Yes, I said memory, but it's more like gate paths than an internal store; that's why you never forget multiplication. A better example: you get on a subway at 5th Street, pass two platforms, and you know you're at 8th even though you didn't have to add 5 + 3.
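The commenter's distinction between computing an answer and recognizing it can be sketched as repeated addition versus a memorized lookup table. This is a toy illustration, not how either the brain or the chip actually works:

```python
# Toy contrast between "computing" and "recognizing" an answer. The
# memorized table stands in for a learned pattern; neither approach
# reflects real hardware, it just illustrates the comment's point.

def multiply_by_adding(a, b):
    """The 'tick marks' approach: repeated addition, redone every time."""
    total = 0
    for _ in range(b):
        total += a
    return total

# The 'memorized chart' approach: answers are looked up, not recomputed.
times_table = {(a, b): a * b for a in range(10) for b in range(10)}

print(multiply_by_adding(7, 8))  # 56, computed step by step
print(times_table[(7, 8)])       # 56, recognized in one lookup
```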

