At Google I/O 2017, Google introduced a host of new features, including an AI for your smartphone camera that lets you do more than you might think!
Looking at something but missing the details you need about it? A newly introduced AI feature called Google Lens helps you learn more about what you're seeing. It's the 21st century, and everything is going AI!
Google Lens will work hand-in-hand with Google Assistant, using AI to identify objects in the world around you, surface actionable information about them on your screen, and perform actions through Google's various apps.
For example, Lens can help you identify a flower you're looking at using Google image search, or offer restaurant recommendations based on your location in Google Maps.
Lens is reportedly being integrated into both Google Assistant and Google Photos, but as with the object recognition feature, there's no word yet on exactly when it will arrive or how it will work when it does.
With Google Lens, your smartphone camera won’t just see what you see, but will also understand what you see to help you take action. #io17 pic.twitter.com/viOmWFjqk1
— Google (@Google) May 17, 2017
The service seems ambitious and versatile. Scott Huffman, Vice President of Engineering for Assistant, described using Lens to translate a Japanese street sign and to identify a food he didn't recognize.
Huffman said Lens facilitates a conversation between the user and Assistant, drawing on visual context and machine learning.