Google is reportedly overhauling Lens, its image search technology, by adding a set of filters to the tool.
According to a report from 9To5Google, the publication decompiled the latest version of Google Lens uploaded to the Play Store and found lines of code that hint at what is to come. While Google may never ship the features in an update, the code offers a glimpse of where the company may be heading. Lens, which gained real-time recognition, smart text selection and native camera app integration last year, could now add translate, dining and shopping filters, among others.

According to the report, which is based on strings of code, the five filters appear at the bottom of the screen when Lens is launched, giving users access to specific visualization tools. The report noted that the Translate filter will likely let users translate foreign text and will be more advanced than the current feature, which recognizes text and points users to Google's Translate app.
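For readers unfamiliar with how such teardowns work, outlets typically decode an app's package and scan its string resources for names and phrases that suggest unreleased features. The sketch below is a minimal, hypothetical illustration of that kind of keyword search; the file path, keywords and resource layout are assumptions for illustration only and do not reflect the actual Lens package.

    import re
    from pathlib import Path

    # Hypothetical path to string resources decoded from an APK with a
    # decoding tool; the file name and location are assumptions, not the
    # real layout of the Google Lens package.
    STRINGS_FILE = Path("lens_decoded/res/values/strings.xml")

    # Keywords loosely based on the filters named in the report.
    KEYWORDS = ("filter", "translate", "dining", "shopping")

    def find_hints(path: Path, keywords) -> list[str]:
        """Return <string> entries whose name or value mentions a keyword."""
        hints = []
        pattern = re.compile(r'<string name="([^"]+)">(.*?)</string>', re.S)
        for name, value in pattern.findall(path.read_text(encoding="utf-8")):
            text = f"{name} {value}".lower()
            if any(k in text for k in keywords):
                hints.append(f"{name}: {value.strip()}")
        return hints

    if __name__ == "__main__":
        for hint in find_hints(STRINGS_FILE, KEYWORDS):
            print(hint)

Matches like these, rather than working features, are what typically fuel reports about functionality that has not yet shipped.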
Another line of code suggests users could take action based on a text selection, whether it is an address or a number, while the dining feature could work like Google Maps' AR mode: users may be able to point their phone's camera and see nearby restaurants. Adding filters to a tool that is supposed to serve up information simply by pointing a camera may seem counterintuitive, but the report speculated that the filters could return more accurate results for users. The publication also noted that while it is not clear when filters will come to Google Lens, it is a good bet they could arrive at Google's developer conference this year.
Since rolling out Lens, Google has used the technology to make other aspects of search more useful. In October, the company announced it was bringing Lens to Google Images. In a blog post at the time, Assaf Broitman, product manager of Google Images, said that when users see something in an image and want to know more, they can use Lens to get more information about it. Similar to how Lens works with Google Assistant and Google Photos, Lens in Google Images identifies things within the image and also shows the user similar images. Once users press the Lens button in Google Images, Broitman said, dots will appear on objects that they can then learn more about.