Google advanced its artificial intelligence efforts this week with a new shopping feature that shows how clothing looks and moves on real human models, giving consumers a better sense of what they are actually buying.
The new virtual try-on feature in Google Shopping, part of a larger update, takes a single image of a garment and simulates how it would look when worn. That means generating images of how the fabric moves, folds, clings, and stretches on a body.
Unlike some other approaches, the models themselves aren't AI-generated; the technology comes into play in rendering how the clothing behaves on their bodies. It all runs on proprietary technology built by the search giant, which trained the model on photos of real people posing in a variety of outfits.
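To make the data flow concrete, here is a minimal Python sketch of what an image-conditioned try-on pipeline might look like. Google has not published its API or model architecture, so every name here (GarmentImage, ModelPhoto, TryOnModel) is hypothetical; the sketch only illustrates the two inputs the article describes, a retailer's product photo and a photograph of a real model, being combined into a generated preview.

```python
from dataclasses import dataclass

# Hypothetical types; Google's actual interfaces are not public.
@dataclass
class GarmentImage:
    pixels: list  # RGB values from the retailer's product photo

@dataclass
class ModelPhoto:
    pixels: list    # photo of a real person, not an AI-generated one
    body_pose: str  # e.g. "standing, arms at sides"

class TryOnModel:
    """Stand-in for a generative network trained, as the article notes,
    on pictures of real people posing in many different outfits."""

    def render(self, garment: GarmentImage, model: ModelPhoto) -> list:
        # A real system would synthesize how the garment drapes, folds,
        # clings, and stretches on this specific body and pose; here we
        # return a placeholder composite just to show the data flow.
        return garment.pixels + model.pixels

# Usage: condition the generator on both images to get a try-on preview.
pipeline = TryOnModel()
preview = pipeline.render(
    GarmentImage(pixels=[0.1, 0.2, 0.3]),
    ModelPhoto(pixels=[0.4, 0.5, 0.6], body_pose="standing"),
)
print(f"Generated try-on preview with {len(preview)} values")
```

The key design point this sketch captures is that the human model is a fixed photographic input rather than a generated asset: the generative step is confined to rendering the garment onto that photo.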
Google used a diverse set of models spanning a wide range of body shapes, skin tones, and hair types, in sizes from XXS to 4XL.
For now, the feature covers only women's tops from Anthropologie, Everlane, H&M, and LOFT, and it is available only in the US. Customers can access it by tapping the new "Try On" badge in Google Search.
Later this year, men’s tops and other clothing items will be added to the lineup.