IBM Research has begun working with Stanford University, the University of Wisconsin-Madison, Cornell University, Columbia University and the University of California, Merced to create computing systems that emulate the brain's abilities for sensation, perception and cognition.
Big Blue says that without the ability to monitor, analyse and react to exponentially growing information in real-time, much of its value may be lost – and that cognitive computing offers a solution.
“Exploratory research is in the fabric of IBM’s DNA,” says Josephine Cheng, IBM fellow and vice president of its Almaden Research Centre in San Jose. “We believe that our cognitive computing initiative will help shape the future of computing in a significant way, bringing to bear new technologies that we haven’t even begun to imagine.”
She says that by seeking inspiration from the structure, dynamics, function, and behaviour of the brain, the IBM-led research team aims to break the conventional programmable machine mould.
Ultimately, she says, the team hopes to rival the brain’s low power consumption and small size by using nanoscale devices for synapses and neurons.
“This technology stands to bring about entirely new computing architectures and programming paradigms,” she says.
If she's right, the result would be a machine that can integrate and analyse vast amounts of data from many sources in the blink of an eye, allowing businesses and individuals to make rapid decisions in time for them to have a significant impact.
The end game: ubiquitously deployed computers imbued with a new kind of intelligence that can integrate information from a variety of sensors and sources, deal with ambiguity, respond in context-dependent ways, learn over time and recognise patterns in order to solve difficult problems of perception, action and cognition in complex, real-world environments.