Google is slowly rolling out its machine learning-powered Lens feature to the Google Assistant on Pixel 2 and original Pixel handsets across the globe.
Google Lens was first shown off alongside the Pixel 2. The smart feature uses computer vision to recognise items and places seen through the phone’s camera, or within an image a person is viewing, and serve up information about them.
For example, snap a picture of a book cover and Lens will serve up a snippet of information about the author and the book.
But Lens initially featured only in the camera app of the Pixel 2 handsets, so it could be used solely with photos the phone snapped, rather than with other images via the Google Assistant. Google did note that Lens would come to its virtual assistant, but didn’t reveal when.
However, Google now appears to be quietly slipping Lens into the Google Assistant. According to Reddit user DHatch207, Google Lens showed up in the Google Assistant on their original Pixel handset running the Android Oreo 8.1 beta.
“No need to take a picture to use it, it popped up with it’s [sic] own viewfinder,” said DHatch207, who posted a picture showing the Lens feature as a small icon in the bottom right of the Google Assistant panel.
The rollout appears to be happening at a relatively sedate pace, as only a few other tech-savvy Redditors have noted the update appearing on their Pixel and Pixel 2 handsets.
There’s no word on whether Lens will be pushed out to other Android phones, but there’s a good chance the smart feature will extend its reach, much as the Google Assistant did after its initial launch back in 2016.
What smart features would you like to see Google put into the Assistant? Let us know on Twitter or Facebook.