Earlier this year, Google unveiled the next big thing it was working on: perfecting the Google Assistant and its offshoot, Google Lens (among other things). The feature was shown off as excelling at image recognition, with the on-stage demo highlighting capabilities such as recognizing objects, people, and even network passwords.

While Lens made its way to the new Pixel phones, the question on everyone's minds was when it would come to other devices. Although the feature is still in early access, Google has kept it exclusive to its Pixel lineup. As of now, many last-gen Pixel users have reported receiving the feature in their Google Photos app.

The reveal came when Reddit user LaceratedCantaloupe found that Lens had rolled out to their Photos app seemingly without an official update, confirming that it is indeed a server-side rollout. We immediately checked our own 2016 Pixel XL and were greeted with a 'Preview' screen. While we knew the feature was still in development, we didn't expect it to work as flawlessly as it did in the I/O demo. As of now, the feature can only be accessed from the Google Photos app, with built-in Google Assistant integration coming soon.

One must remember, though, that this isn't the first time we're seeing a feature of this kind. Earlier this year, Samsung introduced its own virtual assistant, Bixby, for its Galaxy lineup, complete with an image recognition feature of its own: Bixby Vision. While the initial backlash against Bixby has since died down considerably, some who appreciated Bixby Vision wondered what would happen if Google adopted similar technology.

The thing is, Google has already done this in the past, albeit rather differently. Anyone who has used Google Goggles will recognize it as essentially a toned-down version of Lens. While Goggles never really took off, Lens just might.

Google Lens can, as of now, read and infer from any text it finds in your photos, and reliably recognizes things such as movie posters, album covers, or even your pets. Here are a few examples:

While it's clear that Google has a long way to go before the feature becomes as useful as the demos and the company's promises suggest, this first step is welcome, and we can only hope to see it improve. With all the advancements Google is making through its use of AI, we hope Apple takes a page out of its book and implements something similar to finally make Siri, and the iOS ecosystem as a whole, better.

Google Lens will be accessible natively from the Google Assistant in the coming weeks, with a rollout to other devices presumably coming after the 'Preview' period is over. We'll be covering the feature more extensively as it develops.
