A woman is shopping for jeans. She’s 4’11” and on the heavier side. Finding a pair that fits her waist comfortably without needing to be hemmed six inches is a nearly impossible task even in a store, where she can try on pair after pair — let alone online, where she essentially has to guess whether the jeans will fit, and then go through the rigmarole of returning them if they don’t.
Another woman is 5’11” and slim. Every “short dress” she tries on fits her like a long shirt. By now, she knows better than to believe the dress will look on her the way it looks on the mannequin — let alone on a model in an online photo, whose measurements are unknown and may even have been photoshopped.
These two women have different body types, but the same problem — and it’s shared by any customer brave enough to shop for clothing online. It’s a gamble, and a big headache when it doesn’t work out.
There are a few innovators trying to solve the problems of fashion fit and style, both online and in physical or omnichannel environments.
Fashion Tech
Take, for instance, the various incarnations of smart mirrors that debuted at the recent International Consumer Electronics Show (CES) in Las Vegas. These inventions use technologies like 4D imaging and touchless interaction to guide users through makeup tutorials or provide beauty regimen advice.
Even before CES, some stores were adding magic mirrors to fitting rooms. These let customers see how clothing would look in different lighting or in another color, provided a 360-degree view, and connected the shopper to a store associate if a different size or color was needed.
Then there’s Amazon’s Alexa-powered Look device, which has a built-in camera to help users choose and assess an outfit. They can also take pictures or short videos of multiple outfits and submit them to Amazon’s judging panel, which selects the best one. A companion mobile app enables tagging of favorites, style comparisons and, of course, shopping recommendations.
A Different Approach
Perhaps counterintuitively, Boston-based tech company True Fit took a non-visual approach to this challenge, instead opting to leverage data from both consumers and brands to suggest products that are likely to match not only the shopper’s body shape and size, but her personal style as well.
True Fit Co-Founder Romney Evans explained how True Fit is able to do all that without ever seeing a picture of the customer — and why, in his opinion, it’s better that way.
First, said Evans, the platform grabs data from the brand behind the product. This includes technical specifications, such as fabrication, neck type and more than 100 other attributes that are used to categorize the garment’s fit and style.
Then, True Fit grabs data from the user during onboarding, including height, weight and age. But the real key is the reference item, Evans said. Customers are asked to share a garment they already own that fits them perfectly.
If that garment is, say, a pair of Michael Kors jeans in size 28, all that data True Fit collected from manufacturers enables the platform to know exactly what that garment says about the customer’s body type and how that will translate to garments from other brands.
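The idea behind this reference-item matching can be illustrated with a minimal sketch. Everything here is hypothetical — the attribute names (`waist_cm`, `inseam_cm`, `stretch`), the tiny catalog, and the distance metric are invented for illustration and are not True Fit's actual data model or algorithm, which uses more than 100 attributes per garment:

```python
from math import sqrt

# Hypothetical garment records: brand, labeled size, and a few of the many
# fit attributes a platform like True Fit might catalog (names invented here).
CATALOG = [
    {"brand": "BrandA", "size": 27, "waist_cm": 69, "inseam_cm": 74, "stretch": 0.3},
    {"brand": "BrandA", "size": 28, "waist_cm": 71, "inseam_cm": 75, "stretch": 0.3},
    {"brand": "BrandB", "size": 28, "waist_cm": 73, "inseam_cm": 78, "stretch": 0.1},
    {"brand": "BrandB", "size": 29, "waist_cm": 75, "inseam_cm": 79, "stretch": 0.1},
]

FIT_KEYS = ("waist_cm", "inseam_cm", "stretch")

def fit_distance(a, b):
    """Euclidean distance over the shared fit attributes."""
    return sqrt(sum((a[k] - b[k]) ** 2 for k in FIT_KEYS))

def recommend_size(reference, brand):
    """Given a garment that already fits the shopper, pick the candidate
    in another brand whose measured fit sits closest to it."""
    candidates = [g for g in CATALOG if g["brand"] == brand]
    return min(candidates, key=lambda g: fit_distance(reference, g))

# The shopper's reference item: a BrandA pair in size 28 that fits perfectly.
reference = next(g for g in CATALOG if g["brand"] == "BrandA" and g["size"] == 28)
best = recommend_size(reference, "BrandB")
print(best["size"])  # → 28: BrandB's 28 measures closest to the reference
```

The point of the sketch is that the labeled size is just one attribute among many: the match is made on measured characteristics, so a shopper's well-fitting BrandA garment can map to a differently labeled size in BrandB.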
“Digital commerce has increased, and with that, the challenge of fit and sizing has become even more pronounced,” Evans said. “Trying clothes on at a store is hard too, but online it’s just a guessing game.”
Evans said that a data-driven approach benefits participants at every level of the transaction. Customers experience more confidence and less friction. That means the retailer sees more sales and fewer returns. And the brand has the benefit of matching its products to people who are likely to have a positive fit experience, making them more likely to shop with that brand again.
Innovation Trajectory
Evans doesn’t think that smart mirrors, imaging, augmented reality and other visual tech solutions are a bad way to approach the fashion problem — he sees those technologies as partners and believes it’s possible to create a “magical” experience for users.
However, he said it’s still early days for such technology, and more progress will be needed before that potential can come to fruition in a way that resonates with consumers.
“There are a lot of challenges inherent in visualizing clothes,” Evans said. “People are self-conscious about their bodies. There’s a certain level of performance and realism required, and the tech is not quite there yet. Therefore, it’s not satisfying to customers and doesn’t help answer the question of whether to buy the product.”
He feels that richer data — of the kind that True Fit gathers — could inform these visual shopping technologies going forward to take them beyond the “paper doll” stage into true visualization.
“Fashion is a unique category,” Evans concluded. “It’s deeply personal by nature; it’s a form of expression and communication. When it comes to distinguishing oneself, details matter.”