Last week, in the GDAT class, we discussed how performance visualization tools require a good impedance match between the digital computer under analysis and the cognitive computer of the analyst—AKA the brain.
This week, IBM will announce its first-generation cognitive computing chip, which it claims mimics the human brain. It is the result of Phase 2 of a grant for DARPA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project, awarded more than 3 years ago to half a dozen IBM labs and five major universities.
According to IBM, "BlueMatter, a new algorithm created by IBM researchers in collaboration with Stanford University, exploits the Blue Gene supercomputing architecture in order to noninvasively measure and map the connections between all cortical and sub-cortical locations within the human brain using magnetic resonance diffusion weighted imaging. Mapping the wiring diagram of the brain is crucial to untangling its vast communication network and understanding how it represents and processes information." [Source: CNET]
Two prototypes of these so-called cognitive chips are currently being tested. The semiconductors were created with standard technology in IBM's VLSI fab plants. Both cores were fabricated on a 45 nm process and feature 256 neurons. One core contains 262,144 programmable synapses—a social network on a chip, as ZDNet's Larry Dignan describes it—while the other contains 65,536 learning synapses. IBM has demonstrated navigation, machine vision, pattern recognition, associative memory, and classification with these chips.
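As a back-of-the-envelope check (my own arithmetic, not from IBM's announcement), both synapse counts factor neatly as crossbar arrays over the 256 neurons, which is the usual geometry for this kind of neuromorphic core:

```python
# Hypothetical sanity check of the published synapse counts.
# Assumption: synapses are laid out as a crossbar over the 256 neurons.
neurons = 256

# 65,536 learning synapses = a full 256 x 256 crossbar,
# i.e. every neuron potentially connected to every other.
learning_synapses = neurons * neurons
print(learning_synapses)   # 65536

# 262,144 programmable synapses = 1024 inputs x 256 neurons,
# four times the fully connected count.
programmable_synapses = 4 * neurons * neurons
print(programmable_synapses)  # 262144
```

The exact crossbar dimensions are my assumption; IBM's announcement gives only the totals.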
So, what does it do for you? Here, IBM hedges somewhat. It wants you to think of the chip as complementary to conventional von Neumann digital processors. Taken together, the two types of chips could be used to correlate data, create hypotheses, and remember state in a broader sense than a RAM chip can. The combination of these two types of chips would be a cognitive computer.
No word yet on general availability or price.