Cognitive computer


A cognitive computer combines artificial intelligence and machine-learning algorithms in an approach that attempts to reproduce the behaviour of the human brain. It generally adopts a neuromorphic engineering approach.
An example of a cognitive computer implemented using neural networks and deep learning is IBM's Watson machine. A subsequent IBM development is the TrueNorth microchip architecture, which is designed to be closer in structure to the human brain than the von Neumann architecture used in conventional computers. In 2017 Intel announced its own version of a cognitive chip, Loihi, which it planned to make available to university and research labs in 2018. Intel, Qualcomm, and others continue to improve neuromorphic processors steadily; Intel followed Loihi with its Pohoiki Beach and Pohoiki Springs systems.

IBM TrueNorth chip

TrueNorth is a neuromorphic CMOS integrated circuit produced by IBM in 2014. It is a manycore processor network-on-chip design, with 4,096 cores, each having 256 programmable simulated neurons for a total of just over a million neurons. In turn, each neuron has 256 programmable "synapses" that convey the signals between them, so the total number of programmable synapses is just over 268 million. Its basic transistor count is 5.4 billion. Because memory, computation, and communication are handled in each of the 4,096 neurosynaptic cores, TrueNorth circumvents the von Neumann bottleneck and is very energy-efficient, with IBM claiming a power consumption of 70 milliwatts and a power density one ten-thousandth that of conventional microprocessors. The SyNAPSE chip operates at lower temperature and power because it draws only the power necessary for computation. Skyrmions have been proposed as models of the synapse on a chip.
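Each simulated neuron in a core of this kind typically implements a variant of the leaky integrate-and-fire model. The Python sketch below shows that class of model, together with the arithmetic behind the neuron and synapse totals quoted above; the leak factor, threshold, and update rule are illustrative assumptions, not IBM's actual neuron equation.

    # Illustrative leaky integrate-and-fire (LIF) neuron of the kind that
    # digital neuromorphic cores simulate. Parameters are assumptions for
    # illustration, not IBM's actual neuron model.
    class LIFNeuron:
        def __init__(self, leak=0.9, threshold=1.0):
            self.v = 0.0               # membrane potential
            self.leak = leak           # per-timestep decay factor
            self.threshold = threshold

        def step(self, synaptic_input):
            """Integrate weighted input, apply leak, spike on threshold."""
            self.v = self.v * self.leak + synaptic_input
            if self.v >= self.threshold:
                self.v = 0.0           # reset after spiking
                return 1               # emit a spike
            return 0

    # TrueNorth-scale bookkeeping: 4,096 cores x 256 neurons x 256 synapses.
    cores, neurons_per_core, synapses_per_neuron = 4096, 256, 256
    print(cores * neurons_per_core)                        # 1,048,576 neurons
    print(cores * neurons_per_core * synapses_per_neuron)  # 268,435,456 synapses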

Intel Loihi chip

Intel's self-learning neuromorphic chip Loihi, perhaps named after the Hawaiian seamount Loihi, offers substantial power efficiency through a design inspired by the human brain. Intel claims Loihi is about 1,000 times more energy-efficient than the general-purpose computing power needed to train the neural networks that rival Loihi's performance.
In theory, this would support both machine-learning training and inference on the same silicon, independently of a cloud connection, and more efficiently than convolutional neural networks or deep-learning networks. Intel points to a system for monitoring a person's heartbeat, taking readings after events such as exercise or eating, and using the chip to normalize the data and work out the "normal" heartbeat. It can then spot abnormalities while also adapting to new events or conditions.
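A minimal, conventional (non-neuromorphic) Python sketch of that heartbeat idea is given below: it learns a per-context baseline rate online and flags deviations. The running-mean scheme, the context labels, and the thresholds are assumptions for illustration, not Intel's on-chip algorithm.

    # Sketch of the heartbeat example: learn a per-context "normal" rate
    # online, then flag deviations. The running-mean scheme and thresholds
    # are illustrative assumptions, not Intel's actual implementation.
    class HeartbeatMonitor:
        def __init__(self, alpha=0.1, tolerance=0.25):
            self.baselines = {}         # per-context mean bpm
            self.alpha = alpha          # learning rate for the running mean
            self.tolerance = tolerance  # fractional deviation deemed abnormal

        def observe(self, context, bpm):
            """Update the context's baseline; return True if bpm is abnormal."""
            if context not in self.baselines:
                self.baselines[context] = bpm  # first reading defines normal
                return False
            baseline = self.baselines[context]
            abnormal = abs(bpm - baseline) / baseline > self.tolerance
            # keep adapting, so new events and conditions shift "normal"
            self.baselines[context] = (1 - self.alpha) * baseline + self.alpha * bpm
            return abnormal

    monitor = HeartbeatMonitor()
    monitor.observe("resting", 62)
    monitor.observe("after exercise", 130)
    print(monitor.observe("resting", 95))  # True: far above the resting baseline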
The first iteration of the Loihi chip was made using Intel's 14 nm fabrication process and houses 128 clusters of 1,024 artificial neurons each, for a total of 131,072 simulated neurons. This offers around 130 million synapses, still a long way from the human brain's estimated 800 trillion synapses, and behind IBM's TrueNorth, whose 4,096 cores provide around 268 million synapses.
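The counts quoted above follow directly from the core layout, as a quick back-of-the-envelope check shows (the 800-trillion brain figure is the estimate cited above):

    # Back-of-the-envelope check of the figures quoted above.
    loihi_neurons = 128 * 1024             # 131,072 simulated neurons
    loihi_synapses = 130e6                 # ~130 million (as cited above)
    truenorth_synapses = 4096 * 256 * 256  # 268,435,456 programmable synapses
    brain_synapses = 800e12                # estimate cited above
    print(loihi_neurons, truenorth_synapses)
    print(brain_synapses / loihi_synapses)  # roughly a six-million-fold gap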
Loihi is now available for research purposes to more than 40 academic research groups as a USB form-factor device. Later developments include Pohoiki Beach, a system built from 64 Loihi chips.
In March 2020, Intel and Cornell University published a research paper demonstrating the ability of Intel's Loihi to recognize different hazardous materials, which could eventually help "diagnose diseases, detect weapons and explosives, find narcotics, and spot signs of smoke and carbon monoxide".

SpiNNaker

SpiNNaker is a massively parallel, manycore supercomputer architecture designed by the Advanced Processor Technologies Research Group at the Department of Computer Science, University of Manchester.

Criticism

There are many approaches to and definitions of the cognitive computer, and some approaches may prove more fruitful than others.
Specifically, critics argue that a room-sized computer, as in the case of Watson, is not a viable alternative to a three-pound human brain. Some also cite the difficulty for a single system to bring together so many elements, such as disparate sources of information and computing resources. At the 2018 World Economic Forum, experts claimed that cognitive systems can adopt the biases of their developers, as demonstrated by a Google image-recognition (computer vision) algorithm that misidentified African Americans in an offensive way.
