It’s no secret that Google is heavily invested in machine learning across its vast software portfolio. Products like Street View, Inbox Smart Reply and even voice search rely heavily on machine learning to deliver results to users quickly.
But behind what may seem like rather mundane software on the surface is some incredibly powerful hardware. In Google’s case, the company has spent the past year powering its machine learning efforts with a custom ASIC it calls the Tensor Processing Unit (TPU).
If you want to know just how powerful Google’s TPU is, look no further than the famed AlphaGo computer that bested Go champion Lee Sedol on multiple occasions during the DeepMind AI Challenge. Sedol is one of the most decorated human Go players in the world, and even he proved to be no match for a machine that played in “such a perfect manner.”
“[We] have found [TPUs] to deliver an order of magnitude better-optimized performance per watt for machine learning,” explained Distinguished Google Hardware Engineer Norm Jouppi this afternoon in a blog post. “This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).”
According to Google, the TPUs — which are compact enough to fit into a single hard drive slot in Google’s data center racks — have been specifically designed to “squeeze more operations per second into the silicon” so that “users get more intelligent results more rapidly.” Given how reliant Google’s products have become on machine learning (just look at the recently revealed Google Home Wi-Fi speaker), this custom hardware could genuinely change the way we interact with Google services for the better.
“Machine Learning is transforming how developers build intelligent applications that benefit customers and consumers, and we're excited to see the possibilities come to life,” added Jouppi.