At last week’s IBM Smarter Computing Forum, we had a session on Watson, IBM’s artificial intelligence machine that won Jeopardy last year, and another session on IBM-sponsored research helping to create the SyNAPSE digital neuromorphic computing chip.
Putting “Watson to work”
Apparently, IBM is taking Watson’s smarts and applying them to health care and other information-intensive verticals (intelligence, financial services, etc.). At the conference, IBM had Manoj Saxena, senior director of Watson Solutions, and Dr. Herbert Chase, a professor of clinical medicine from Columbia School of Medicine, come up and talk about Watson in healthcare.
Mr. Saxena contended, and Dr. Chase concurred, that Watson can play an important part in helping healthcare apply current knowledge. Watson’s core capability is the ability to ingest and make sense of information and then apply that knowledge. In this case, that means using medical research knowledge to help diagnose patient problems.
Dr. Chase had been struck at a young age by one patient who had what appeared to be an incurable and unusual disease. He was an intern at the time and was given the task of diagnosing her issue. Eventually, he was able to provide a proper diagnosis, but it irked him that it took so long and so many doctors to get there.
So as a test of Watson’s capabilities, Dr. Chase input this person’s medical symptoms into Watson, and it was able to provide a list of potential diagnoses. Sure enough, Watson’s list included the medical problem the patient actually had all those years ago.
At the time, I mentioned to another analyst that Watson seemed to represent the end game of artificial intelligence: almost a final culmination of 60 years of AI research, creating a comprehensive service offering for a number of verticals.
That’s all great, but it’s time to move on.
SyNAPSE is born
In the next session, IBM had Dr. Dharmendra Modha come up and talk about their latest SyNAPSE chip, a new neuromorphic digital silicon chip that mimics the brain to model neurological processes.
We are quite a ways away from productization of the SyNAPSE chip. Dr. Modha showed us a real-time demonstration of the SyNAPSE chip in action (connected to his laptop), with it interpreting a handwritten numeral into its numerical representation. I would say it’s a bit early yet to put “SyNAPSE to work”.
Digital vs. analog redux
I have written before about the SyNAPSE neuromorphic chip and a competing technology, the direct analog simulation of neural processes (see IBM introduces SyNAPSE chip and MIT builds analog synapse chip). In the MIT brain chip post, I discussed the differences between the two approaches, focusing on the digital vs. analog divide.
It seems that IBM research is betting on digital neuromorphic computing. At the Forum last week, I had a discussion with a senior exec in IBM’s STG group, who said that the history of electronic computing over the last half century or so has been mostly about the migration from analog to digital technologies.
Yes, but that doesn’t mean that digital is better, just easier to produce.
On that topic, I asked Dr. Modha what he thought of MIT’s analog brain chip. He said:
- MIT’s brain chip was built on a 180nm fabrication process, whereas his is on 45nm, or 4X finer. Perhaps the fact that IBM has some of the best fabs in the world may have something to do with this.
- The digital SyNAPSE chip can potentially operate at 5.67GHz and will be absolutely faster than any analog brain simulation. Yes, but each analog simulated neuron is actually part of a parallel processing complex, and with a thousand or a million of them operating in parallel, even running 1000X or a million times slower, they should be able to keep up.
- The digital SyNAPSE chip was carefully designed to be complementary to current digital technology. As I look at IT today, we are surrounded by analog devices that interface very well with the digital computing environment, so I don’t think this will be a problem when we are ready to use it.
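The parallelism argument in the second point above comes down to simple arithmetic, which can be sketched as follows. The 5.67GHz figure is from the discussion; the neuron count and slowdown factor are purely illustrative assumptions:

```python
# Back-of-the-envelope comparison: one fast serial digital chip vs. many
# slow analog neurons working in parallel. Only the 5.67 GHz clock comes
# from the discussion above; the other figures are illustrative assumptions.

digital_clock_hz = 5.67e9            # claimed potential SyNAPSE clock rate
slowdown_factor = 1_000_000          # assume each analog neuron is a million times slower
analog_neuron_hz = digital_clock_hz / slowdown_factor
analog_neurons = 1_000_000           # assume a million neurons firing in parallel

digital_throughput = digital_clock_hz                 # one serial event stream
analog_throughput = analog_neurons * analog_neuron_hz # aggregate parallel events/s

print(f"digital: {digital_throughput:.2e} events/s")
print(f"analog:  {analog_throughput:.2e} events/s")
```

Under these (admittedly convenient) assumptions, a million neurons each running a million times slower match the digital chip’s raw event rate: parallelism offsets clock speed.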
Analog still surrounds us and defines the real world. Someday the computing industry will awaken from its digital hobbyhorse and see the truth in that statement.
In any case, if it takes another 60 years to productize one of these technologies, then the Singularity is farther away than I thought; somewhere around 2071 should about do it.