Google's DeepMind artificial intelligence subsidiary has accomplished a lot, and it doesn't look like the progress train is about to slow down anytime soon. As a quick recap, this year alone we've seen DeepMind take down a world-champion Go player, start work on an AI kill switch, attempt to cure blindness, improve speech generation, and even slash a mammoth power bill. What's next? Completely self-contained learning.
We wouldn't be shocked if visions of Skynet started running through your head right now. In DeepMind's new hybrid system, called a Differentiable Neural Computer (DNC), a neural network is paired with an external memory bank it can read from and write to, and the AI is smart enough to pore over the data it's given and form its own connections using what it has stored in memory.
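To give a rough feel for the core idea (this is a toy sketch, not DeepMind's implementation): the network produces a "key" vector, and the read from external memory is a similarity-weighted blend of memory rows rather than a hard lookup, so the whole memory access stays smooth and learnable.

```python
# Toy content-based memory read in the spirit of a DNC (illustrative only).
# The read is a softmax over cosine similarities between a key and each
# memory slot, so it is differentiable end to end. Values are invented.
import numpy as np

def content_read(memory: np.ndarray, key: np.ndarray, beta: float = 10.0) -> np.ndarray:
    """Read from `memory` (one slot per row) by similarity to `key`."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sims = memory @ key / norms          # cosine similarity per slot
    weights = np.exp(beta * sims)
    weights /= weights.sum()             # soft attention over slots
    return weights @ memory              # blended read vector

memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
print(content_read(memory, np.array([0.9, 0.1, 0.0])))  # close to the first row
```

Because the read is a weighted average instead of an index, gradients flow through it, which is what lets the system learn *how* to use its memory.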
In a video example, DeepMind feeds the system facts about a handful of family members, and once enough data has been learned, it can work out information it was never explicitly told -- in this particular case, who an individual's great uncle is.
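The flavor of that demo can be mimicked by hand (the DNC *learns* to do this; the hard-coded version below only shows what "never explicitly told" means): store only parent and sibling facts, then derive a relation that appears nowhere in the data. The names here are invented.

```python
# Hand-coded family-tree inference (a stand-in for what the DNC learns).
# Only "parent" and "sibling" facts are stored; "great uncle" is derived.
parents = {"Alice": ["Bob"], "Bob": ["Carol"]}   # child -> parents
siblings = {"Carol": ["Dave"]}                   # person -> siblings

def great_uncles(person: str) -> list:
    """A great-uncle is a sibling of a grandparent (gender ignored here)."""
    found = []
    for parent in parents.get(person, []):
        for grandparent in parents.get(parent, []):
            found.extend(siblings.get(grandparent, []))
    return found

print(great_uncles("Alice"))  # -> ['Dave'], a fact never stated directly
```

The impressive part of the real system is that nobody writes the `great_uncles` rule -- the DNC composes it from the stored facts on its own.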
This might seem like standard fare given the machine learning examples we've seen over the past few years, but what makes this different is that the DNC uses its own memory to store and process information -- much like our brains do. With information held in memory, the system can begin to pick out patterns and give us answers with scary accuracy.
The uses for this kind of self-learning could be huge. Another example provided is London's subway system -- a network that's genuinely complex and that millions of people use each day. With a computerized brain churning through all of the possibilities, a future system like this could recommend routes, and how they shift over the course of a day, with enough accuracy to tell us the best possible time to hop on.
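The kind of question the DNC answered about the Underground is, at heart, path-finding over a graph of stations. A plain breadth-first search over a tiny made-up slice of the network (the DNC learns this behavior rather than being programmed with it) looks like this:

```python
# Hand-rolled route-finding over a toy subway graph (illustrative only;
# the DNC learns to answer such queries rather than running BFS).
from collections import deque

def route(graph: dict, start: str, goal: str) -> list:
    """Breadth-first search: returns a fewest-stops path, or [] if none."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []

# A tiny, incomplete subset of real station names, linked arbitrarily.
tube = {
    "Oxford Circus": ["Bond Street", "Green Park"],
    "Bond Street": ["Oxford Circus", "Baker Street"],
    "Green Park": ["Oxford Circus", "Victoria"],
    "Baker Street": ["Bond Street"],
    "Victoria": ["Green Park"],
}
print(route(tube, "Baker Street", "Victoria"))
# -> ['Baker Street', 'Bond Street', 'Oxford Circus', 'Green Park', 'Victoria']
```

Layer time-of-day crowding and service changes on top of a graph like this and you get the harder, fuzzier problem where a learned system could beat a fixed algorithm.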
As with most things involving deep learning, the sky(net) is really the limit, and this latest development from Google just keeps pushing the boundaries.