Google recently announced that Lens, a feature that lets users learn more about an object by scanning it with their device’s camera in real time, will soon roll out to all Pixel smartphones. A report by Android Authority claims that the feature has started rolling out to Pixel users in the US, UK, Australia, Canada, India, and Singapore, but only on devices with their language set to English.
Pixel owners who have received the update can access the feature by launching Google Assistant and tapping the Lens icon in the bottom-right corner. This opens the camera viewfinder, and tapping anywhere on the screen scans the object in front of the smartphone’s camera. After scanning the object, Google Assistant displays the likely results with a carousel of cards beneath them, allowing users to perform a web search, open other apps, and more.
Google Lens can currently understand and recognise many different kinds of objects, including text, landmarks, art, books, movies, and barcodes. It also allows users to connect to a Wi-Fi network by scanning the password on the back of a router. Google earlier said it was working on making its Assistant less fragmented and more natural sounding, and it has been updating the app with new features. The company recently updated the Assistant to recognise songs playing nearby: users can ask, “what song is playing?” or “what song is this?” and it will display the song’s name along with the artist’s name and the lyrics.
Google Assistant can also set up daily subscriptions, which allow users to receive reminders for weather updates, quotes, facts, and more. Simply launching the Assistant and saying “send me a quote every day” or “send me a mindfulness tip every day” will set the daily reminder.