The technology has been hailed as a move towards “cognitive computing”, allowing machines to perform human-like functions such as recognition and learning.
“Imagine traffic lights that can integrate sights, sounds and smells and flag unsafe intersections before disaster happens or imagine cognitive co-processors that turn servers, laptops, tablets, and phones into machines that can interact better with their environments,” said Dharmendra Modha, the leader of the SyNAPSE project, a collaboration with American universities that has so far received $20m in funding from DARPA, the Pentagon’s technology research arm.
The new chips could also lead to more powerful, efficient computers that take up less space, according to IBM. The work aims “to create a system that not only analyzes complex information from multiple sensory modalities at once but also dynamically rewires itself as it interacts with its environment – all while rivalling the brain’s compact size and low-power usage”, the firm said.
The basis on which current computing systems work, known as von Neumann processing, becomes unsustainably inefficient and complex as information from more sources is fed in.
Computer scientists have long been fascinated by how the human brain, by contrast, performs incredible feats of information processing using very little power, carrying out many calculations at once in different, linked locations.
A key stage in the development of the new chips was the creation two years ago of BlueMatter, a software algorithm that simulates the pattern of connections in the human brain.
Each of the two new chips contains 256 neurons, connected to each other in a way that mimics this pattern. The ultimate goal is to scale them up to the tens of billions of neurons found in the human brain.
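The idea of many simple "neurons" that integrate inputs from their neighbours and fire in parallel can be sketched with a toy leaky integrate-and-fire model. This is purely an illustration of the general concept, not IBM's actual chip design; every name and parameter below is invented for the example:

```python
import numpy as np

def simulate(steps=100, n_neurons=256, seed=0):
    """Toy leaky integrate-and-fire network: n_neurons units with random
    sparse connections, illustrating parallel, event-driven computation.
    (An illustration only -- not the SyNAPSE architecture.)"""
    rng = np.random.default_rng(seed)
    # Hypothetical sparse random synaptic weights between neurons.
    mask = rng.random((n_neurons, n_neurons)) < 0.1
    weights = rng.normal(0, 0.5, (n_neurons, n_neurons)) * mask
    v = np.zeros(n_neurons)      # membrane potentials
    threshold, leak = 1.0, 0.9   # spike threshold and leak factor
    total_spikes = 0
    for _ in range(steps):
        spikes = v >= threshold          # neurons that fire this step
        total_spikes += int(spikes.sum())
        v[spikes] = 0.0                  # reset fired neurons
        # Each neuron integrates input from spiking neighbours plus noise,
        # all updated at once rather than instruction by instruction.
        v = leak * v + weights @ spikes + rng.random(n_neurons) * 0.2
    return total_spikes

print(simulate())
```

Unlike a von Neumann machine, where one processor fetches and executes instructions serially, every unit here updates simultaneously on each step, which is the property the article describes the brain (and these chips) as exploiting.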
“These chips are another significant step in the evolution of computers from calculators to learning systems, signalling the beginning of a new generation of computers and their applications in business, science and government,” said Mr Modha.