Neurosynaptic chips

Building blocks for cognitive systems

The SyNAPSE project

In many ways, computers today are nothing more than very fast number-crunchers and information manipulators. They can process lots of data, but they don’t really think. They all adhere to the Von Neumann architecture, largely unchanged over the last half-century, in which memory and processing are kept separate and computation proceeds by executing a series of pre-written “if X then do Y” statements. With the advent of Big Data, which grows larger, faster and more diverse by the day, this computing model is inadequate for processing and making sense of the volumes of information that people and organizations need to deal with.

In searching for an answer, IBM researchers found inspiration for a new computer chip design in the most powerful, efficient information-processing device in the world: the human brain. The cognitive capabilities of the brain include understanding the surrounding environment, dealing with ambiguity, and acting in real time and within context – all while consuming less power than a light bulb and occupying less space than a two-liter bottle of soda.

In August 2011, as part of the SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) project, IBM researchers led by Dharmendra S. Modha successfully demonstrated a building block of a novel brain-inspired chip architecture based on a scalable, interconnected, configurable network of “neurosynaptic cores” that brings memory, processors and communication into close proximity. These new silicon neurosynaptic chips allow for computing systems that emulate the brain’s computing efficiency, size and power usage.
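
To make the contrast with a Von Neumann machine’s separated memory and processing concrete, here is a minimal Python sketch of a core that keeps its synaptic memory right next to its neurons. The core size, connectivity, weights and integrate-and-fire update below are illustrative assumptions, not the actual SyNAPSE hardware specification.

    import numpy as np

    # Minimal sketch of a "neurosynaptic core": synaptic memory sits next
    # to the neurons that use it, so no data shuttles across a memory bus.
    # The 256 x 256 size, sparsity and update rule are assumptions for
    # illustration only, not the SyNAPSE hardware design.
    class NeurosynapticCore:
        def __init__(self, n_axons=256, n_neurons=256, threshold=1.0, seed=0):
            rng = np.random.default_rng(seed)
            # Local memory: a sparse crossbar of axon-to-neuron connections.
            self.synapses = rng.random((n_axons, n_neurons)) < 0.1
            self.weights = rng.normal(0.0, 0.5, (n_axons, n_neurons))
            self.potential = np.zeros(n_neurons)  # membrane potentials
            self.threshold = threshold

        def tick(self, spikes_in):
            """Advance one time step given a binary vector of input spikes."""
            # Integrate: each neuron sums the weights of connected, spiking axons.
            self.potential += (self.weights * self.synapses).T @ spikes_in
            # Fire and reset every neuron whose potential crossed the threshold.
            spikes_out = self.potential >= self.threshold
            self.potential[spikes_out] = 0.0
            return spikes_out

    # Drive one core with sparse random input spikes for a few time steps.
    core = NeurosynapticCore()
    rng = np.random.default_rng(1)
    for step in range(5):
        out = core.tick(rng.random(256) < 0.05)
        print(f"step {step}: {int(out.sum())} neurons fired")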



We are working to create a FORTRAN for neurosynaptic chips. While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.

Dr. Dharmendra S. Modha, Principal Investigator and Senior Manager, IBM Research


New chips - new programming language

With these new neurosynaptic chips comes the need for a new programming language to enable the development of sensory-based cognitive computing applications. Modha and his team have developed a new software ecosystem that supports all aspects of the programming cycle - from design through development, debugging and deployment - and could enable a new generation of applications that mimic the brain’s abilities for perception, action and cognition.

When the initial prototypes of the chips were created, each neurosynaptic core had to be programmed individually. To make cognitive applications easier to build and to help create an ecosystem of application developers, the team has created composable, reusable building blocks of these cores called corelets. Each corelet performs a particular function, and corelets can be put together in different configurations to create new applications, as sketched below. For example, a corelet could include all of the individual cores that perceive sound. The programmer could use that corelet in conjunction with others that represent edge detection and color identification to develop a new application that takes advantage of all those features.
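
The Corelet class, compose helper and example corelets below are a hypothetical Python illustration of this idea, not IBM’s actual corelet language or API; they show only the composition pattern described above, with each corelet hiding the cores that implement it and exposing a single named task.

    # Hypothetical illustration of the corelet idea; IBM's real corelet
    # language and APIs are not shown here.
    class Corelet:
        def __init__(self, name, task):
            self.name = name  # the overall function, e.g. "edge detection"
            self.task = task  # callable standing in for the underlying cores

        def __call__(self, stimulus):
            return self.task(stimulus)

    def compose(*corelets):
        """Wire corelets in sequence so each output feeds the next input."""
        def pipeline(stimulus):
            for corelet in corelets:
                stimulus = corelet(stimulus)
            return stimulus
        return Corelet(" -> ".join(c.name for c in corelets), pipeline)

    # Assumed building blocks, echoing the examples in the text.
    edges = Corelet("edge detection", lambda frame: f"edges({frame})")
    colors = Corelet("color identification", lambda frame: f"colors({frame})")
    scene = compose(edges, colors)  # a new application built from parts
    print(scene.name)       # edge detection -> color identification
    print(scene("frame0"))  # colors(edges(frame0))

Because compose returns another corelet, the result can itself be reused as a building block - that is what makes corelets composable.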

These corelets allow developers to create applications without programming individual neurosynaptic cores - all they need to know is the overall task of a particular corelet. While the actual cognitive chip has yet to be released, developers can already begin writing code with this new method by using a simulator that the team developed to create and test their ideas. So far, IBM researchers have developed a library of 150 corelets, with plans to allow third parties to submit more after a rigorous testing process. IBM is ready to place the new technology, the new tools, and the new way of thinking in the hands of fellow IBMers, academic researchers, business partners and clients who can begin building real-life cognitive systems. To that end, IBM will develop a teaching curriculum and guide on how to program corelets.



Cognitive applications

Because these neurosynaptic chips are small and low-power, and corelets enable sensory perception, a whole new range of applications can be built. Below are some conceptual designs that the team has envisioned:

tumbleweed

An autonomous robot in the shape of a sphere, with multi-modal sensing including image and sound, could be deployed in a disaster area for search and rescue missions. An internal mechanism would allow it to roll around an environment to survey areas, identify people in need, assess the condition of the zone and spot possible hazards. It could also communicate with people it finds and guide them to safety through speakers and a video display.

assistive vision glasses

Low-power, lightweight eyeglasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze a scene to recognize objects and obstacles, helping the user navigate safely through the area. This information could be communicated through audio cues for those with no sight, or enhanced visual cues for those with limited sight.

home health applications

A thermometer could not only measure a user’s temperature but also be outfitted with a camera to share images that may be of concern directly with doctors. The thermometer could also be outfitted with sensors to detect smell, recognizing the presence of certain bacteria by the unique odors they give off and alerting the user if medical attention is needed.

conversation flower

Placed on a table during a meeting, the conversation flower would use audio and video processing to identify specific speakers by their voice and appearance and automatically create a transcript, correctly attributing each speaker. The device would also open like a flower when the conversation becomes vibrant and animated.

jellyfish

A network of sensor buoys could be created to monitor shipping lanes for safety and environmental protection. These solar-powered devices would blend into the environment while protecting it – monitoring conditions, looking for mines and reporting hazards.



The future of cognitive chips

IBM’s long-term goal is to build a neurosynaptic chip system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.
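
A quick back-of-the-envelope calculation, derived only from the targets stated above rather than any published IBM figure, shows how aggressive that power budget is per component:

    # Back-of-the-envelope arithmetic on the stated long-term targets.
    neurons = 10e9      # ten billion neurons
    synapses = 100e12   # one hundred trillion synapses
    power_w = 1000.0    # one kilowatt power budget

    print(f"synapses per neuron: {synapses / neurons:,.0f}")         # 10,000
    print(f"power per neuron: {power_w / neurons * 1e9:.0f} nW")     # 100 nW
    print(f"power per synapse: {power_w / synapses * 1e12:.0f} pW")  # 10 pW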

These new neurosynaptic chips will be complementary to cognitive systems like Watson - the right brain and left brain, respectively, of cognitive systems. Watson, the left brain, focuses on language and analytical thinking; the cognitive chips address senses and pattern recognition. Over the coming years, IBM scientists hope to meld the two capabilities together, just as the human brain combines right-brain and left-brain functions, to create a holistic computing intelligence.



Meet the scientists

  • Andrew Cassidy

    Research Staff Member, Cognitive Computing
    IBM Research - Almaden

  • Arnon Amir

    Research Staff Member
    IBM Research - Almaden

  • Ben Shaw

    Research Staff Member, Cognitive Computing
    IBM Research - Almaden

  • SyNAPSE team

    Meet the team, comprising researchers from IBM Research locations in Almaden, Austin, Watson and India

