
Cognitive systems: A new era of computing


Over the past few decades, Moore's Law, processor speed and hardware scalability have been the driving factors enabling IT innovation and improved systems performance. But the von Neumann architecture—which established the basic structure for the way components of a computing system interact—has remained largely unchanged since the 1940s. Furthermore, to derive value, people still have to engage with computing systems on the machines' terms, rather than having computers adapt to interact with people the way people work.

With the continuous rise of big data, that's no longer good enough.

"Today we stand poised on the brink of a new era of computing in which technology is more consumable, insight-driven and cognitive. IBM Research is exploring and developing the enabling technologies that will transform the way computers are used."
– Ginni Rometty, IBM President and CEO

We are now entering the Cognitive Systems Era, in which a new generation of computing systems is emerging with embedded data analytics, automated management and data-centric architectures in which storage, memory, switching and processing move ever closer to the data.

Whereas computers in today's programmable era essentially execute a series of "if/then" rules, cognitive systems learn, adapt, and ultimately hypothesize and suggest answers. Delivering these capabilities will require a fundamental shift in the way computing progress has been achieved for decades.
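To make the contrast concrete, the sketch below sets a hand-written "if/then" rule beside a toy statistical component that scores competing hypotheses and returns them ranked with a confidence estimate. This is an illustrative Python sketch only; the task, feature names and weights are invented and do not represent any IBM system.

import math

def programmable_route(transaction_amount: float) -> str:
    # Programmable era: a fixed "if/then" rule written by hand.
    if transaction_amount > 10_000:
        return "flag for review"
    return "approve"

def cognitive_route(features: dict, weights: dict) -> list:
    # Cognitive style: combine (learned) weights into a score, convert it to a
    # confidence, and return ranked hypotheses instead of one fixed answer.
    score = sum(weights.get(name, 0.0) * value for name, value in features.items())
    p_flag = 1.0 / (1.0 + math.exp(-score))  # logistic confidence
    hypotheses = [("flag for review", p_flag), ("approve", 1.0 - p_flag)]
    return sorted(hypotheses, key=lambda h: h[1], reverse=True)

print(programmable_route(12_500))
# The weights here are invented; in practice they would be learned from data.
print(cognitive_route({"amount_zscore": 2.1, "new_merchant": 1.0},
                      {"amount_zscore": 0.9, "new_merchant": 0.4}))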

The four characteristics of cognitive systems


They are data-centric

The data produced today isn't just growing in volume; it is also arriving faster, taking more forms and becoming increasingly uncertain in nature. Uncertainty arises from such sources as social media, imprecise data from sensors and imperfect object recognition in video streams. IBM experts believe that by 2015, 80 percent of the world's data will be uncertain.
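One common way to handle this kind of uncertainty, shown in the hypothetical Python sketch below, is to carry a confidence score alongside each reading so that downstream analytics can weight data by how much it can be trusted. The numbers are invented for illustration.

readings = [
    {"value": 21.4, "confidence": 0.95},  # well-calibrated sensor
    {"value": 27.9, "confidence": 0.30},  # noisy or imprecise sensor
    {"value": 22.1, "confidence": 0.85},
]
weighted_sum = sum(r["value"] * r["confidence"] for r in readings)
total_confidence = sum(r["confidence"] for r in readings)
print(f"confidence-weighted estimate: {weighted_sum / total_confidence:.2f}")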

They are designed for statistical analytics

Watson, the Jeopardy-winning system, is an early example. When Watson answers a question, it analyzes uncertain data and develops a statistical ranking and a level of confidence in its answers. It then goes "offline" for additional training to refine its capabilities. In the future, Watson will be able to engage in interactive dialog with people, develop evidence profiles revealing the source of its answers, and engage in continuous learning based on its own experiences.
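The idea of ranking answers statistically can be illustrated in a few lines of Python: combine evidence scores for each candidate answer, normalize the totals into confidences, and sort. The candidates, scores and combination rule below are invented for illustration; they are not Watson's actual pipeline.

def rank_candidates(evidence: dict) -> list:
    # Average the evidence scores per candidate, then normalize into confidences.
    combined = {answer: sum(scores) / len(scores) for answer, scores in evidence.items()}
    total = sum(combined.values())
    ranked = [(answer, score / total) for answer, score in combined.items()]
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

evidence = {
    "Chicago":  [0.81, 0.77, 0.64],  # scores from hypothetical evidence sources
    "Toronto":  [0.42, 0.35, 0.50],
    "New York": [0.30, 0.22, 0.41],
}
for answer, confidence in rank_candidates(evidence):
    print(f"{answer:<9} confidence = {confidence:.2f}")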



These systems "scale-in"

Historically, performance improvements in IT systems have come from scaling down (Moore's Law, which describes how semiconductors become denser, more powerful and more compact); scaling up (adding more powerful processors to a single system); and scaling out (linking together more and more processors or entire systems in parallel).

In cognitive systems, performance improvements will derive from scaling in: moving key components, such as storage, memory, networking and processing, onto a single chassis, closer to the data. Netezza and the new IBM PureSystems are the first commercially available examples of scaling in. In the future these capabilities will move even closer to the data, scaling in computing elements first within a single drawer or card and eventually onto a single, three-dimensional chip module. This scale-in effect will reduce the latency that can occur when trying to move terabytes or exabytes of data around a computing system.
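A back-of-envelope calculation shows why proximity matters. The Python sketch below compares the time to move one terabyte over a typical 10 Gb/s network link with the time over a much faster in-chassis path; the bandwidth figures are illustrative assumptions, not measurements of any IBM system.

def transfer_seconds(data_bytes: float, bandwidth_bytes_per_s: float) -> float:
    # Time = amount of data divided by the rate at which it can move.
    return data_bytes / bandwidth_bytes_per_s

terabyte = 1e12
print(f"1 TB over a 10 Gb/s network link: {transfer_seconds(terabyte, 10e9 / 8):,.0f} s")
print(f"1 TB over a 100 GB/s local path:  {transfer_seconds(terabyte, 100e9):,.0f} s")

Scaling the data set to petabytes or exabytes multiplies both times proportionally, which is why shortening the distance data must travel pays off.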

They automate system and workload management

Deploying applications in an enterprise environment often requires that multiple virtual machines be configured manually, a complex, time-intensive process prone to error. For the new PureSystems, IBM Research scientists developed software tools to create and manipulate blocks of code so users can drag and drop the pieces they need for compute power, storage and software applications. The blocks already know how to connect to one another and across multiple virtual machines.
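The sketch below illustrates the general idea of self-describing building blocks: each block declares what it connects to, so tooling can work out a valid deployment order automatically instead of an administrator wiring virtual machines by hand. The block names, fields and resolution logic are hypothetical and are not the PureSystems pattern format.

pattern = {
    "blocks": [
        {"name": "web-app", "type": "application", "connects_to": ["app-db"]},
        {"name": "app-db", "type": "database", "connects_to": ["block-storage"]},
        {"name": "block-storage", "type": "storage", "connects_to": []},
    ],
}

def deployment_order(pattern: dict) -> list:
    # Place each block only after the blocks it connects to are in place
    # (assumes the declared connections contain no cycles).
    depends_on = {b["name"]: b["connects_to"] for b in pattern["blocks"]}
    ordered, placed = [], set()
    while len(ordered) < len(depends_on):
        for name, deps in depends_on.items():
            if name not in placed and all(d in placed for d in deps):
                ordered.append(name)
                placed.add(name)
    return ordered

print(deployment_order(pattern))  # ['block-storage', 'app-db', 'web-app']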

Three areas of new exploration

Even though IBM is already leading in this new era, the company is not satisfied to focus on continuous improvements of existing capabilities. IBM is performing far-reaching, exploratory research on core technologies, applications and architectures that will sustain this new era well into the next decade.

Core technologies

As Moore's Law begins to reach its physical limits, IBM Research is exploring core transformational technologies to enable processing at the atomic level. IBM researchers, for example, recently demonstrated the ability to store a bit of information in as few as 12 magnetic atoms; today's disk drives use about one million atoms to store a single bit. A recent breakthrough in quantum computing, which harnesses the properties of sub-atomic particles to create hyper-efficient calculation capabilities, indicates the potential for a qubit system that could factor a 3,000-digit number at a rate 10^40 times faster than is possible today.

"We're not trying to build a brain, we're trying to draw inspiration from the brain."
– Dharmendra Modha, Manager, Cognitive Computing

Architectures

Simulating the brain's neuron-and-synapse model in new computing architectures might open new avenues for high-performance, energy-efficient computing systems. In 2011, the SyNAPSE project, conducted by IBM Research in collaboration with DARPA, yielded its first cognitive computing chip, called TrueNorth. The chip simulates the spiking interactions between neurons and synapses in the brain through advanced algorithms and silicon circuitry. The project's first two prototype chips have been fabricated and are currently undergoing testing.
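For readers unfamiliar with the neuron-and-synapse model, the short Python sketch below implements a generic leaky integrate-and-fire neuron, the textbook abstraction behind spiking architectures. It is a didactic model only, not the TrueNorth chip's circuitry or IBM's algorithms, and the threshold, leak and input values are arbitrary.

class SpikingNeuron:
    def __init__(self, threshold: float = 1.0, leak: float = 0.9):
        self.potential = 0.0
        self.threshold = threshold  # membrane potential at which the neuron fires
        self.leak = leak            # per-step decay toward the resting potential

    def step(self, synaptic_input: float) -> bool:
        # Integrate weighted input, leak, and fire (spike) if over threshold.
        self.potential = self.potential * self.leak + synaptic_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after a spike
            return True
        return False

neuron = SpikingNeuron()
inputs = [0.3, 0.4, 0.5, 0.0, 0.2, 0.9]
print([neuron.step(x) for x in inputs])  # [False, False, True, False, False, True]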

New applications

Tomorrow's advanced applications will revolutionize technology as well as the industries in which they're applied. The advent of social business and the widespread adoption of social networking technologies open new areas of possibility, as entire networks of knowledge and expertise can be connected and optimized in ways similar to the optimization of supply chains. With Watson 2.0, the ability to engage in dialogue with humans and to learn on the fly has enormous implications if applied in medicine, finance or other industries.

Anyone else undertaking these types of grand challenges would inevitably face this fact: without decades of research into systems, semiconductors, software and services, along with their underlying chemical, electrical, biological and computational bases, leadership in the cognitive systems era would be nearly impossible.

For IBM, it's the logical next step toward becoming the world's most essential company.
