The Latest Version of Google Photos Lets You Play Around with Google Lens

September 16, 2017, By Sanjeev Ramachandran

Google Lens was one of the major surprises Google pulled out of its sleeve during its I/O conference this year. In short, Lens is an advanced version of Google Goggles, letting you pull information from any image using Google's neural network capabilities.

Back at its launch, it was evident that the new feature would work hand in hand with Google Assistant. Now Google has kick-started the final round of testing before it becomes official, introducing the feature to the Google Photos app.

The latest version of Google Photos now lets developers test the advanced machine learning features of Google Lens. Users can capture live images and send them to the neural network to retrieve contextual information. That includes picking up an event from a poster or flyer to add to a calendar, or searching for details about buildings, places, or just about anything you come across.

According to developers, test results have been impressive, with Lens returning only about one wrong result for every ten right ones. That's more than impressive, given that Lens is still learning.

The developer version also lets users run any image from their gallery through Lens, not just those fed directly from the device camera. That suggests Google is ready to roll out the feature in its full form all at once. A teardown of the app also reveals other functionality, including what can be scanned and Lens's ability to detect URLs, contacts and much more.
