Google will soon be extending the reach of Google Lens, its visual search interface. In a blog post, the company announced that Lens will be integrated into the Google Assistant in the coming weeks. The feature is still exclusive to Pixel phones, but now it should be much easier to access.
Google Lens came out in beta on the Google Pixel 2, which launched last month. The service is basically a revamp of Google Goggles: you take a picture of something, run it through Google's computer vision algorithms, and Google will try to tell you what's in the picture. Google says Lens can identify text, landmarks, and media covers, but those were all things Goggles could do years ago. We tried Lens on the Pixel 2 at launch, and while it was definitely a beta with plenty of problems, it occasionally did something impressive, like recognizing not just that a picture contained a dog, but also nailing the dog's breed.
Google says the Assistant integration will let you get "quick help with what you see." This sounds like a big improvement over the current beta of Google Lens, which is only integrated into Google Photos. Doing any kind of recognition through the Photos app is really slow, since you have to open the camera app, aim it at something, take a picture, open the picture, and then run it through Lens. The new home for Lens will be much easier: you just open the Assistant and tap the Lens icon in the bottom-right corner.
Google says the Lens-in-Assistant integration will be coming to "Pixel phones set to English in the US, UK, Australia, Canada, India and Singapore over the coming weeks."