At its Google I/O conference this past May, the search and technology giant unveiled a neat new feature that would first roll out to its own Pixel devices. Called Google Lens, it pays a bit of an homage to Google Goggles, a rather popular tool that let people get quick feedback on real-world objects. Take a picture of a statue outside a sports stadium, for example, and Goggles could tell you who it depicted and why it was honored.
Google Lens is going to be far more capable than Goggles, though, thanks in part to the recent resurgence of deep learning and AI bolstering our handsets' brains. Currently, it appears most of the computation is done off-device, but with AI processors finding their way into current and future smartphones (Apple's A11 Bionic has a 'neural engine', and Qualcomm has been working on AI chips for a while), it seems likely that before long, our smartphones themselves will play a more integral role.
Right now, Google is rolling out the Assistant tie-in to Lens to Pixel and Pixel 2 owners, so if you own one of these devices (read our review of the Pixel 2 and 2 XL), you can check for updates and see if one is available for your device. If not, then in the words of Ace Ventura, "just wait longer".
With Google Lens built into Assistant, saying "OK Google" and tapping the camera button lets you immediately take advantage of Lens in different ways. Point your camera at an unusual object, and let your phone tell you all about it. Conveniently, users won't have to take a photo of an object first; instead, Google Assistant captures it in-line, saving time and making for a more fluid experience.
Google Lens Example (Credit: 9TO5Google)
At this point, this kind of feature might feel like a cool tech demo, but it has real uses. Imagine hanging out with a friend who's reading a book, and you get to talking about it. Simply by using Lens to scan the cover, you could purchase the book within seconds, or look up related media (news, video, etc.). One example Google has previously given involves using Lens to scan the Wi-Fi label on a router, reading the network name and password, and letting the phone use that info to sign on immediately. Maybe such a design would encourage router makers to set genuinely difficult default passwords, and the feature could also prove useful in free Wi-Fi spots like coffee shops.
Whatever you end up using Google Lens for, it sounds like a pretty cool feature, and one every non-Pixel owner will be anxiously waiting for. Hopefully it won't take too long to pop up on other devices.