The Google Assistant is gaining some new tricks, and Google is applying its machine learning algorithms to your smartphone’s camera. Soon, you’ll be able to pull out your smartphone, point the camera at an object, and have the Assistant display relevant content based on what it “sees”.
Since your smartphone knows where you are based on your GPS location data, you can aim your camera at the front of a restaurant, for example. The Assistant will recognize the restaurant and provide you with important information like ratings, contact details and hours of operation.
Think of it as Google Translate that has been massaged to help in more everyday situations.
Google gave the example of pointing your camera at a flower to see what type it is. Another example that got a big reaction from the crowd was the ability to capture login information from the bottom of your Wi-Fi router and automatically enter it so that you can connect to the internet.
We should also mention that Google confirmed what was rumored earlier this week: the Google Assistant is coming to iOS devices, starting with the iPhone. While Google Assistant on iOS won’t be as feature-filled as its Android counterpart, it will be able to scan your Gmail account to give you things like package tracking updates, and it will let you control your smart home devices.