Fashion, at its core, is a social experience, with people sharing their favorite looks and trying to replicate outfits worn by friends, celebrities and style icons. But often, it’s difficult to find the right item to complete the look because consumers don’t know the right terms to use in their searches.
As artificial intelligence (AI) has improved, though, consumers no longer have to figure out the perfect way to describe that particular blazer they saw once — many retailers are now adopting visual search, using technology to identify the relevant items a shopper might be seeking.
Farfetch, for instance, has been offering the feature since 2019 through a partnership with Syte, a visual AI Software-as-a-Service (SaaS) company. Using the luxury marketplace’s app, customers can take and upload photos through the search bar, and the software identifies the items pictured. The app then offers several interactive tags, such as “jacket,” “dress” and “bowtie,” which can be used to search Farfetch for similar merchandise.
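The tag-and-search flow described above can be sketched in a few lines of Python. This is purely illustrative: the tag detector and catalog below are hypothetical stand-ins, not Syte’s or Farfetch’s actual systems.

```python
# Illustrative sketch of a visual-search flow: an uploaded photo is
# mapped to item tags, and each tag is matched against a product
# catalog. The detector and catalog are hypothetical stand-ins,
# not any vendor's real API.

def detect_tags(photo_bytes: bytes) -> list[str]:
    # In production this would call a computer-vision model; here it
    # is stubbed so the sketch stays self-contained and runnable.
    return ["jacket", "bowtie"]

CATALOG = [
    {"name": "Wool blazer", "tags": {"jacket"}},
    {"name": "Silk bowtie", "tags": {"bowtie"}},
    {"name": "Evening gown", "tags": {"dress"}},
]

def search_similar(photo_bytes: bytes) -> list[str]:
    # Return catalog items sharing at least one tag with the photo.
    tags = set(detect_tags(photo_bytes))
    return [item["name"] for item in CATALOG if item["tags"] & tags]

print(search_similar(b"uploaded-photo"))  # -> ['Wool blazer', 'Silk bowtie']
```

In a real deployment, the detector would return model-predicted tags with confidence scores, and the catalog lookup would be a similarity search over product embeddings rather than exact tag matching.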
Lihi Pinto Fryman, co-founder and chief marketing officer for Syte, told PYMNTS in a 2019 interview that visual search makes the shopping experience more seamless and user-friendly by removing uncertainty and allowing brands to align results as closely as possible to a customer’s vision. “[Users do not] have to explain what they’re looking for, or what made them fall in love with that item,” Fryman said. “All they need to do is upload an image and find their inspiration.”
Read more: AI-Powered Visual Shopping Experiences for Millennials, Gen Z
Fryman noted that Farfetch and other retailers that use visual search may not be able to provide the exact same product, “because maybe it’s something that an influencer is wearing and [it] costs $5,000,” but it gives the merchant a chance to display similar items. “It’s really connecting the [product] inspiration from social with the retailer’s collection,” she added.
Up Against Google
Startups such as Syte are now up against search giant Google, which earlier this year combined its Google Lens image recognition technology with its Shopping Graph database to offer visual search capabilities. Using the Google Lens app, consumers can select items in photos to see details and shop for similar products. The company also enhanced its visual search results to make on-screen interactions more engaging, delivering more comprehensive results and a more enjoyable shopping journey.
Also see: Google Shopping Revamp Expands Opportunities for Merchants to Turn Searches Into Sales
Matt Madrigal, VP/GM of merchant shopping at Google, told Karen Webster ahead of the feature’s launch in September that the goal is to create “a visual feed of products … interspersed with video content and style guides.”
“With this change … we’re making it easier to window-shop your favorite brands and discover new brands, and at the same time, help more brands get discovered,” he said.
Google Cloud also recently rolled out a new Retail Search product built on Google’s decades of search experience in an effort to help retailers enhance consumer experiences with personalized results and relevant promotions.
Related news: Google Cloud Retail Search Aims to Solve $300B Abandonment Problem