New research creates a computer chip that emulates human cognition

Developed in part by Yale's Rajit Manohar, the TrueNorth chip is a pioneering example of neuromorphic technology — circuitry modeled after the human brain.
A microchip manufacturing device bearing the IBM logo.

This article originally appeared in Yale Engineering magazine.

Imagine working in an office where, once you finish one task, you have to wait until everyone in all the other cubicles has completed the tasks they were working on before you can move on to your next assignment.

That’s how most digital devices that rely on synchronous circuits work. Built-in clocks allow the same amount of time for the completion of each computational function. Based on a binary system of ones and zeros, it’s reliable, but it also means that the system can run only as fast as the slowest function in the chain.

“In a clocked implementation, everything has to fit into a time budget, so unless you make everything faster, your chip doesn’t run faster — and ‘everything’ includes things you don’t always need,” said Rajit Manohar, the John C. Malone Professor of Electrical Engineering and Computer Science.

Even before Siri and Google Home became our household companions, we’ve had a tendency to anthropomorphize computers. It’s long been common for people to speak of computers in terms of “thinking” and to ascribe brain-related traits to them. In truth, though, conventional computers really don’t function like brains at all. But computer science is getting closer.

One sign of this is TrueNorth, a 4-square-centimeter chip that possesses some 5.4 billion transistors and 1 million “neurons” that communicate via 256 million “synapses.” Manohar began working on the chip while he was a faculty member at Cornell, joining a team of IBM researchers in a years-long collaboration that resulted in TrueNorth. Funded by the Defense Advanced Research Projects Agency (DARPA) as part of its Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program, TrueNorth is a pioneering example of the neuromorphic chip — a new breed of computer circuitry modeled after the brain. It’s the size of a postage stamp, and it could be the start of a revolution in how we make and use computers.

Manohar, who started at Yale in January, came to the project through his work with asynchronous systems, one of his research specialties. In devices with these types of circuits, each function is allowed as little or as much time as it needs to complete its task. “It’s like a relay race — you hand the baton to the next person when you’re there,” he said. To allow for greater complexity while using much less energy, all of these functions work asynchronously and in parallel — similar to how neuroscientists believe the brain operates.
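As a loose illustration of that relay-race handoff (this is not TrueNorth code or Manohar’s design; the stage names, timings, and queue setup below are invented), here is a small Python sketch in which each stage starts working the moment it receives an item from the previous one, with no shared clock:

```python
# Toy "relay race" pipeline: each stage hands its result to the next as soon
# as it finishes, with no shared clock. All names and timings are invented.
import threading
import queue
import time

def stage(name, inbox, outbox, work_time):
    while True:
        item = inbox.get()
        if item is None:              # shutdown signal
            if outbox is not None:
                outbox.put(None)
            break
        time.sleep(work_time)         # each stage takes only the time it needs
        result = f"{item} -> {name}"
        if outbox is not None:
            outbox.put(result)        # hand the baton to the next stage
        else:
            print(result)

q1, q2 = queue.Queue(), queue.Queue()
threading.Thread(target=stage, args=("stageA", q1, q2, 0.01)).start()
threading.Thread(target=stage, args=("stageB", q2, None, 0.03)).start()

for i in range(3):
    q1.put(f"job{i}")                 # jobs enter the pipeline as they arrive
q1.put(None)
```

A fast stage never waits for a clock edge; it simply waits for the next baton, and independent stages overlap their work.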

“There’s clearly not a single, carefully synchronized signal that goes to every single neuron in your brain, so it seems that asynchrony is a natural way to think about how computation there occurs,” Manohar said.

“The brain is an asynchronous system that we don’t really understand very well, and it can do certain things that we don’t know how to get computers to do today — and that’s interesting," says Rajit Manohar.

Although asynchronous systems are often thought of as a new branch of study within computer science, their roots go back to the earliest versions of the modern computer. Manohar notes that even the blueprint of the modern computer (the “Von Neumann” machine) from the 1940s explains that asynchronous computation is advantageous. Many early machines were built this way, but computer architecture soon grew in complexity and included a lot more wires. Ensuring that a signal was sent and received correctly within the machine got trickier. An internal timekeeper was needed to make sure that things ran properly, and synchronous circuits became the law of the land.

What the machines gained in orderliness, though, they lost in speed. Take, for instance, the computer in your phone. It’s running at 1 GHz — a billion steps per second — so every step has to fit in one nanosecond. Whatever you’re calculating has to be subdivided into equal blocks of time. If one step finishes early, you have to wait. That can add up to a lot of wasted time.

“Frankly, it’s rare that you have computation where individual things all take the same amount of time,” he said. “Not all computations are equally difficult.”

If one step takes too long, an error occurs. In that case, the computation has to be broken into smaller steps, or the time allotted to each step has to be made longer — and that slows everything else down.
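To put rough numbers on that, here is a minimal sketch (the step durations are invented for illustration and don’t correspond to any real chip) comparing a clocked schedule, where every step is padded to the slowest step’s duration, with an asynchronous one, where each step takes only the time it actually needs:

```python
# Invented step durations, in nanoseconds, for one chain of computation.
step_times_ns = [0.3, 1.0, 0.4, 0.7, 0.2]

# Synchronous: the clock period must fit the slowest step, and every step
# is stretched to fill a full period.
clock_period_ns = max(step_times_ns)
synchronous_total = clock_period_ns * len(step_times_ns)

# Asynchronous: each step hands off the moment it finishes.
asynchronous_total = sum(step_times_ns)

print(f"clocked total:      {synchronous_total:.1f} ns")   # 5.0 ns
print(f"asynchronous total: {asynchronous_total:.1f} ns")  # 2.6 ns
```

In the clocked version, the one slow step sets the pace for all five; in the asynchronous version, the fast steps simply finish and move on.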

Nonetheless, this didn’t pose much concern until the 1980s, when chips started getting bigger and more complicated and the clocks used to keep up with the computing power got more and more expensive to run — taking up as much as 20 percent of a chip’s power consumption.

“So people started looking at asynchronous circuits again in the early ’80s.”

The neurons of TrueNorth work in parallel with each other, each doing what it needs to do to complete a task. They communicate via bursts of electric current, known as spikes. One of the most remarkable things about TrueNorth is how power-efficient it is. Drawing 70 milliwatts of power — equal to that of a hearing aid — its consumption is minuscule compared to that of conventional computers performing similar tasks.
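TrueNorth implements its own digital neuron model, but the general idea of spiking communication can be sketched with a generic leaky integrate-and-fire neuron; the weight, leak, and threshold values below are arbitrary and purely illustrative:

```python
# Generic leaky integrate-and-fire neuron, for illustration only.
def simulate_neuron(input_spikes, weight=0.4, leak=0.1, threshold=1.0):
    """Return the time steps at which the neuron emits a spike."""
    potential = 0.0
    output_spikes = []
    for t, spike_in in enumerate(input_spikes):
        potential += weight * spike_in           # integrate incoming spikes
        potential = max(0.0, potential - leak)   # constant leak each step
        if potential >= threshold:               # fire and reset
            output_spikes.append(t)
            potential = 0.0
    return output_spikes

# A burst of input spikes eventually drives the neuron over threshold.
print(simulate_neuron([1, 1, 0, 1, 1, 1, 0, 0, 1, 1]))  # -> [4]
```

The neuron does nothing until incoming spikes push its potential over the threshold, at which point it fires and resets; computation is driven by events rather than by a clock.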

Dharmendra Modha, lead researcher of the Cognitive Computing group at IBM Almaden Research Center and principal investigator of the DARPA SyNAPSE project, said he recruited Manohar because he’s a “world leader” in the technology required for the project and he had developed “powerful and proven tools.”

“Neurons in the brain are event-driven and operate without any synchronizing clock,” Modha said. “To achieve the ambitious metrics of DARPA SyNAPSE, a key element was to design and implement event-driven circuits, for which asynchronous circuits are natural.”

The TrueNorth chip in detail.

Neuroscience has given us a much better understanding of what’s happening in the brain, and that information inspired the architecture of the TrueNorth chip. But it’s a stretch to call TrueNorth a copy of the brain’s functions since we still don’t know exactly how the brain works. That’s one of the things that fascinates Manohar about his work.

“The brain is an asynchronous system that we don’t really understand very well, and it can do certain things that we don’t know how to get computers to do today — and that’s interesting,” he said. Also, there’s evidence that the brain has a “massively powerful asynchronous computational substrate” that can learn how to do a lot of different applications.

“And it can execute those applications at an efficiency that we don’t know how to achieve on a computer. That’s also interesting.”

Many other efforts in neuromorphic computing start with the aim of better understanding how the brain works. The makers of TrueNorth approached their project from the other direction: how can the processes of the brain make for better computing? That also suits Manohar’s interests.

“I’m not in it to understand the biology. I’m in it to understand how it does this computation.”

To see what kind of real-world applications TrueNorth might have, the research team developed a multi-object detection and classification application and tested it with two challenges: one was to detect people, bicyclists, cars, trucks, and buses that appear periodically on a video; the other was to correctly identify each object. TrueNorth proved adept at both tasks.

Even if it captures just a fraction of the human brain’s complexity — according to its makers, the chip has the brain power of a bumblebee — that’s enough to accomplish some remarkable tasks. Samsung, which has evaluated the TrueNorth chip, announced that it is developing a system in which TV users can control their sets simply by gesturing, changing the channel without ever touching the TV or a remote control. Officials at Los Alamos National Laboratory have also discussed using the chip for some supercomputing calculations.

Manohar is also the founder of Achronix Semiconductor, a company that specializes in high-performance asynchronous field-programmable gate array (FPGA) chips. MIT Technology Review listed him as one of its “35 Innovators Under 35” for his work on low-power microprocessor design. His other specialties include low-power embedded systems, concurrent systems, and formal methods for circuit design.

Manohar says he came to computer science by way of mathematics.

“At some point, I wanted to use mathematics for something more applied,” he said. “I thought computer science was interesting from an applied math perspective — a lot of the techniques and some of the foundations are very mathematical.”

The unprecedented nature of TrueNorth meant a huge amount of resources went into it. The research team not only had to invent the chip, they also had to invent the tools used to build it, since existing computer-aided design (CAD) software wasn’t adequate.

“One of the things that prevents people from working on asynchronous circuits is the lack of tools to design them,” he said. “There’s a huge industry that spends billions of dollars each year improving these CAD tools, but they aren’t tailored to the work we’re doing on asynchronous design, so we have to write our own CAD tools.”

Since the unveiling of TrueNorth, the number of researchers working on asynchronous circuits has increased significantly, but it’s still a small community. The CAD tools that Manohar’s team used were designed specifically for the team’s own projects. But if they can be made more general-purpose, Manohar believes the field will break out, and the technology will advance even more rapidly.

“One of the things we want to do is to have a complete set of tools that we could put in open source and let other researchers use. Often I hear people in industry say, ‘Hey, I’d like to try this, but I don’t know how to start because I don’t have the tools.’”

The benefits of thinking like a brain

The architecture of today’s conventional computers still derives from the Von Neumann model of the 1940s. We don’t use the cardboard punch cards, but the basic idea is still the same. Advances have lessened how long it takes for the memory to transfer data to the processor. But the data still needs to shuttle back and forth, and that requires time and power. For decades, computers have steadily shrunk in size but grown in power. Computer scientists, though, say we’re getting close to the limit of how much we can keep souping up processors. Neuromorphic chips could break open a whole new field that will allow the trend to continue, quite possibly at an even quicker pace.

One of TrueNorth’s radical departures from conventional systems is that data storage and computation aren’t separated. Its neural network can work on multiple tasks without a timekeeping mechanism, breaking free of the linear operation that bogs down conventional systems.
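As a rough sketch of what that combination looks like, assuming nothing about TrueNorth’s actual internal interfaces (the class, connection weights, and event format below are invented for illustration), imagine each core holding its own weights and neuron state and doing work only when a spike event arrives:

```python
# Invented, event-driven toy network: each core keeps its own weights and
# state (memory and compute live together) and only works when a spike lands.
from collections import deque

class Core:
    def __init__(self, weights, threshold=1.0):
        self.weights = weights        # local synaptic weights (the "memory")
        self.potential = 0.0          # local neuron state
        self.threshold = threshold

    def receive(self, source, value):
        """Update local state on an incoming spike; return True if this core fires."""
        self.potential += self.weights.get(source, 0.0) * value
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True
        return False

cores = {"A": Core({"in": 0.6}), "B": Core({"A": 1.0})}
events = deque([("in", "A", 1.0), ("in", "A", 1.0)])   # (source, target, value)

while events:
    source, target, value = events.popleft()
    if cores[target].receive(source, value):
        print(f"core {target} spiked")
        # fan the new spike out to every core that listens to this one
        for name, core in cores.items():
            if target in core.weights:
                events.append((target, name, 1.0))
```

There is no central processor fetching operands from a separate memory, and no clock ticking over idle cores; work happens only where and when a spike arrives.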

Then there’s the matter of what these chips can allow computers to do. Conventional computers are great at brute-force calculations. They’re less adept at recognizing faces or picking out specific voices, tasks that involve pattern recognition. That’s why those CAPTCHA functions that instruct you to pick out Einstein’s face or copy a short alphanumeric pattern to prove you’re human are so effective at keeping out bots.

While neuromorphic computing has advanced greatly since computer scientists first began seriously discussing it in the 1980s, the field is still in its early stages, and many researchers are excited about what can be done with the chips as the technology becomes more sophisticated. As with any potentially game-changing technology, it’s impossible to foresee every commercial application, but many in the field say neuromorphic chips could be key to realizing ready-for-primetime self-driving cars, more human-like robots, and devices to help people with visual impairments.

Of course, getting to that point is no small task. Manohar is currently working with a team of researchers from the University of Waterloo and Stanford University on a multichip system that Manohar says would be the next step forward in neuromorphics.

“We’d like to demonstrate significantly higher efficiency compared to all the existing platforms — that’s always the goal,” he said. “We think we know how to do that.”

He predicts it won’t be long before this kind of technology ends up in everyday devices.

“These neurocomputing algorithms currently provide state-of-the-art performance for tasks like object detection and recognizing faces — tasks that a lot of companies care about today,” he said. “Imagine having photos or videos that you search for in the same way that you search for text today; these types of chips are way more efficient at that kind of computation.”


Media Contact

William Weir: william.weir@yale.edu, 203-432-0105