New Computer Architecture Inspired By The Human Brain Can Help Advance AI Tech


IBM is developing what could be a game-changing computer architecture better equipped to handle the data demands of artificial intelligence.

In a new study, a team of researchers designed a system modeled on the human brain, particularly its memory, neurons, and synapses. They claim the new architecture is far more efficient than conventional computers, which are based on the design developed by physicist and mathematician John von Neumann in the 1940s.

The findings were published in the Journal of Applied Physics.

As Smart As The Human Brain

In the new design, researchers combined the processing and memory units of a computer. Regular computing systems have a central processor responsible for executing logic and computing tasks, a memory unit, mass storage, and input and output devices.

However, the human brain does not need separate components: its synapses function as both processing and storage units. Combining the processor and the memory unit would therefore be far more efficient and would save power.

"If you look at human beings, we compute with 20 to 30 watts of power, whereas AI today is based on supercomputers which run on kilowatts or megawatts of power," explained Abu Sebastian, one of the authors of the study. "In the brain, synapses are both computing and storing information. In a new architecture, going beyond von Neumann, memory has to play a more active role in computing."

The researchers also looked to the brain's synaptic network structures to develop arrays of phase-change memory devices that will "accelerate training for deep neural networks." Using electrical pulses, the researchers created a storage scheme that packs more information into a single nanoscale device.
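The appeal of computing in memory can be sketched in plain Python. In a resistive crossbar, each device's conductance stores a weight; applying input voltages to the rows and summing the currents on each column performs a matrix-vector multiply directly inside the array. The model and numbers below are purely illustrative, not IBM's actual device physics:

```python
# Conceptual model of a memory crossbar doing computation in place:
# each stored conductance G[i][j] acts as a weight, input voltages V[i]
# drive the rows, and by Ohm's and Kirchhoff's laws the current on
# column j is I[j] = sum_i V[i] * G[i][j] - a matrix-vector multiply
# that happens where the data is stored, with no separate processor.

def crossbar_mvm(conductances, voltages):
    """Matrix-vector multiply as a crossbar array would compute it."""
    rows = len(voltages)
    cols = len(conductances[0])
    return [sum(voltages[i] * conductances[i][j] for i in range(rows))
            for j in range(cols)]

# A 2x3 crossbar of stored weights and a two-element input vector
G = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6]]
V = [1.0, 2.0]
print(crossbar_mvm(G, V))  # prints the three column currents
```

In a von Neumann machine, every weight would have to be fetched from memory into the processor for each multiply; in the crossbar, the physics of the array produces all column sums at once, which is where the efficiency gain comes from.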

Changing The Game

Sebastian and his colleagues at IBM tested their design with an unsupervised machine learning algorithm last year. For comparison, they ran the same test on a conventional computer.

"We always expected these systems to be much better than conventional computing systems in some tasks, but we were surprised how much more efficient some of these approaches were," he stated.

The prototype computing system the IBM researchers created performed 200 times faster than the conventional von Neumann system.

ⓒ 2018 All rights reserved. Do not reproduce without permission.