IBM's Watson Cognitive AI Platform Evolves, Senses Feelings And Dances Gangnam Style At NVIDIA GTC

IBM Watson Robot

Artificial Intelligence is something of a holy grail of computing, and while we may never fully reproduce the human brain or its capabilities in electronic form, the world's brightest minds are now building rather convincing systems that can grasp increasingly sophisticated concepts and even learn on their own. IBM is proving that with its constantly improving Watson cognitive computing technology, which one day might have us struggling to perceive the line between AI and organic intelligence.

You might be surprised at the progress that's been made. In the early 1990s, Creative Labs bundled an amusing bit of software called Dr. Sbaitso with its Sound Blaster cards. It was an AI speech synthesis program that interacted with end users as a psychologist, and while it was never meant to be taken seriously, it provided an early glimpse of where AI was headed.

Now, over two decades later, we're conversing with Watson, and the capabilities of what IBM has created are nothing short of amazing. That much was proven five years ago, when Watson competed against the two greatest Jeopardy! players of all time and defeated them without too much trouble to take home the game show's $1 million championship prize.

IBM Watson Timeline

What has Watson been up to since then? IBM has focused its efforts on more practical applications for Watson, including services used by doctors, lawyers, and other professionals who need to analyze massive amounts of data. The challenge IBM faces is keeping Watson fresh: the world's devices produce some 2.5 exabytes of data every day, a flood that is expected to swell the digital universe to 44 zettabytes by the year 2020. To keep up with the information overload, IBM announced late last year that it was adding NVIDIA's Tesla K80 accelerators to the mix. Those high-performance compute GPUs are playing a key role in Watson's cognitive computing development, especially in its natural language processing capabilities.


The result? Watson is more capable and human-like than ever before, especially when injected into a robot body. We got to see this first-hand at NVIDIA's GPU Technology Conference (GTC) when Rob High, an IBM Fellow, vice president, and chief technology officer for Watson, introduced attendees to a robot powered by Watson. During the demonstration, we saw Watson in robot form respond to queries much as a human would, using not only speech but movement as well. When Watson's dancing skills were called into question, the robot responded by showing off its Gangnam Style moves.

IBM Watson Slide NVIDIA GTC

This is the next level of cognitive computing that's beginning to take shape, both in terms of what Watson can do when given the proper form and in what it can sense. Much like a real person, the underlying AI can get a read on people through their movement and through cognitive analysis of their speech, picking up on mood, tone, inflection, and so forth.

"We think this is very promising as a potential area for advancing cognitive computing," High said. "You could argue there’s a natural connection between artificial intelligence and the embodiment of that through robotics."

IBM Watson API Slide

Where this technology goes from here is partially up to developers. IBM will continue to improve Watson, and we're sure there will be more demonstrations to show off its growing capabilities. But beyond dancing robots and game show contestants, IBM has also made APIs available that let developers leverage its technology and the Watson compute platform in general.
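To give a rough idea of what tapping into one of those services looks like, here is a minimal sketch that posts a block of text to a Watson-style tone analysis REST endpoint over HTTP. The URL, API version date, credentials, and response fields shown here are placeholders standing in for whatever a given Watson service actually documents, not an exact reproduction of IBM's API.

```python
# Minimal sketch: querying a Watson-style REST service (tone analysis shown here)
# with the `requests` library. The endpoint URL, version date, and credentials
# below are placeholders -- substitute the values provided for your own Watson
# service instance.
import requests

WATSON_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"  # assumed endpoint
USERNAME = "your-service-username"   # placeholder credential
PASSWORD = "your-service-password"   # placeholder credential

def analyze_tone(text):
    """Send a block of text to the tone endpoint and return the parsed JSON response."""
    response = requests.post(
        WATSON_URL,
        params={"version": "2016-05-19"},           # assumed API version date
        auth=(USERNAME, PASSWORD),                  # HTTP basic auth
        headers={"Content-Type": "application/json"},
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = analyze_tone("I can't believe the robot just danced Gangnam Style!")
    # The response is expected to carry per-tone scores (joy, anger, analytical,
    # and so on); the exact field names depend on the service version.
    print(result)
```

The same general request-and-JSON-response pattern is typical of REST services like these, which is what makes it practical for developers to bolt cognitive features onto their own applications without standing up any of the underlying infrastructure themselves.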

The possibilities are both vast and exciting. It's amazing to think how far we've come in a span of a little over two decades, from Dr. Sbaitso to what Watson can do today. Just think what the next decade or two can bring. No doubt, IBM will also be looking to NVIDIA's next-generation, Pascal-based Tesla P100 for its future high-performance compute engines.