Google Lens will launch within Assistant on all Pixel phones in the coming weeks
Google is bringing its artificial intelligence-powered
Lens tool to all Pixel and Pixel 2 phones in the coming weeks as part of
an update to Google Assistant, the company announced today in a blog post. Lens, which was first unveiled back in May
at the company’s I/O developer conference, is a computer vision system
that lets you point your Pixel or Pixel 2 camera at an object and get
information about it in real time, using an AI-powered algorithm
capable of recognizing real-world items.
Lens was first made available within Google Photos last month
as part of the Pixel 2 launch, and now Google says Lens will soon
arrive as a built-in feature of Google Assistant starting in the US, UK,
Australia, Canada, India, and Singapore “in the coming weeks,” the blog
post reads.
Right now, Lens won’t be able to identify everything
around you; Google says it’s best used on simple items to start. It can
identify text, letting you save information from business
cards, save a URL from a poster or flyer, call a phone number written
down on paper, or open Google Maps with directions to a written address.
Lens can also identify notable landmarks and can pull up informational
websites and media for art, books, and movies when you point the camera at
film posters, book covers, and museum installations.
Lens also works as a more efficient barcode and QR code
scanner. Down the line, Google says Lens will only improve as it learns
more about our surroundings and becomes more adept at identifying
people, objects, and all manner of other things in the real world.
This article was originally published on The Verge.