Our regular analysis selects one or more news stories from fashion technology, and presents The Interline’s take on why they matter to our global brand and retail audience – as well as what they might mean for the longer-term future of fashion. As always, this analysis is also delivered to Interline Insiders by email — and signing up continues to be the best way to get a fresh look at the fashion technology news, completely free, in your inbox.
AI beauty and fashion filters: encouraging the unrealistic and creating data dilemmas
Earlier this week, short-video app users in China surpassed 1 billion for the first time. Popular apps include Douyin and competitor Kuaishou, as well as the major international name: TikTok. With billions of users worldwide, this type of content is now unquestionably one of the primary forces in social media. With this power comes significant cultural clout; short video is now driving the zeitgeist rather than just reflecting it.
Needless to say, fashion is a cornerstone of all this, and TikTok recognised its importance a couple of weeks ago, when the platform elevated fashion to one of a small set of key content categories (alongside food, gaming and sports) – likely a response to the approximately 113.5 billion views on fashion-related hashtags. To put it another way: TikTok is no longer the new offshoot of social media, and the platform has its hooks deep into the relationship between fashion’s creators and consumers.
TikTok is also experimenting with AI on a massive scale, even though the company doesn’t appear to be labelling it as such. This week, the company rolled out its ‘Bold Glamour’ face filter, and users were struck by how advanced it was compared to previous appearance-altering filters, which had an identifiably “static” look, and which would warp or break when the wearer moved, or touched their face or hair. Using Bold Glamour, TikTok users can create an extremely convincing illusion of having an entirely different appearance – one that is subjectively “better”.
This leap forward in believability and consistency is seemingly due to Bold Glamour being powered by AI (more specifically by a generative adversarial network, or GAN). The use of GANs here is not necessarily groundbreaking, since the same approach to deep learning is also behind a host of other AI applications, but like ChatGPT before it, this is a prime example of advances in AI compounding quietly, behind the scenes, before arriving in consumers’ hands in a single package that provokes a universal reaction.
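For readers unfamiliar with the term, the sketch below illustrates the basic adversarial setup a GAN relies on: a generator learns to produce convincing images while a discriminator learns to tell them apart from real ones, and the two improve against each other. This is a minimal, hypothetical PyTorch example for illustration only – it is not TikTok’s implementation, and the network sizes and data shapes are placeholders.

```python
# Minimal, illustrative GAN training step (not any specific product's model).
import torch
import torch.nn as nn

latent_dim, image_dim = 64, 28 * 28  # placeholder sizes for a toy example

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),  # single real/fake logit
)

loss_fn = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def training_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    noise = torch.randn(batch, latent_dim)
    fake_images = generator(noise)

    # 1) Train the discriminator to separate real images from generated ones.
    d_loss = loss_fn(discriminator(real_images), torch.ones(batch, 1)) + \
             loss_fn(discriminator(fake_images.detach()), torch.zeros(batch, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the (just-updated) discriminator.
    g_loss = loss_fn(discriminator(fake_images), torch.ones(batch, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The important point is the feedback loop: the generator’s output only has to be good enough to fool its own critic, and as both halves improve, the results become harder for humans to distinguish from reality too – which is exactly what Bold Glamour’s reception suggests.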
But in this case, that reaction is certainly not universally positive. There have been issues before with Snapchat’s filters, raised by people who were rightly concerned about the association between digitally altering appearances and body dysmorphia, but those same spectres are now becoming much more visible because the Bold Glamour filter delivers results that make the concerns even more acute. It’s an open secret, of course, that a lot of social media content is retouched (indeed, airbrushed photography now has to be labelled as such in some EU countries), but there’s a fundamental difference between altering a static asset offline, and changing someone’s appearance in real-time, with few or no visible cracks. And in this sense, TikTok (and potentially the larger beauty industry) could be wading into a minefield, opening up the possibility of deepening a disconnect between how people actually look, and how they see themselves.
Fashion, at least for the moment, isn’t doing anything similar in terms of enhancing people’s appearances at a fundamental level by way of filters. However, the apparel and footwear industries are already making extensive use of body projection mapping to allow consumers to try on shoes and clothing, and it seems inevitable that AI is going to be employed (where it isn’t already) in making that virtual try-on process more seamless.
At which point brands will face a primary question: are they using technology to provide consumers with an objective mirror (a true, unaltered digital representation of their physical selves) or a subjective one (a heightened or “more flattering” version of themselves)? And while this might seem like a question with an easy answer, it’s important to remember that brands have worked to flatter their shoppers for a long time, and that there’s nothing fundamentally different between placing flattering lighting in a fitting room and making a few subtle adjustments during virtual try-ons.
Again, there is currently no direct analogue in fashion to beauty’s deployment of AR (AI-assisted or not) to manipulate consumers’ bodies, but there is a parallel in the sense that fashion is increasingly turning to AR (and likely AI) to allow people to visualise themselves in a brand’s clothing, and to see themselves as part of its lifestyle. This is a journey that fashion has been on before, in the physical world – moving from an exclusionary lifestyle vision to an inclusive one – but there’s still the possibility that the whole thing could play out again digitally.
Last, but not least, is the issue of data, and what brands do with the biometric information they gain access to during customers’ virtual try-on processes. The gold standard here should be for every scan to be fully anonymised, to live only temporarily on the brand or technology provider’s servers, and to be irretrievably deleted afterwards. However, recent lawsuits suggest that this bar is not always being met.
One of those lawsuits has been dismissed, but on grounds that don’t exactly address the key questions, since it pertained to eyewear try-on, which was successfully argued as falling under healthcare rather than cosmetics. It will be a lot harder to make the same healthcare-based argument for garments and non-orthotic footwear, where more brands are starting to offer an experience that consumers clearly want (virtual try-on is a popular investment area for a reason), but are also inheriting a big responsibility around data governance and data protection in the process.
The same is going to be true for AR filters. As they become more prevalent in the consumer-facing space, fashion brands are entering a tricky arena where they must grapple with where corporate responsibility for people’s self-perception and the like begins and ends, and with deep questions around what they need to do to safeguard their customers’ data. These are big, culture-wide problems that fashion’s embrace of technology is making the industry an active player in – whether it wants to be or not.
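To make the “gold standard” described above a little more concrete, the sketch below shows one hypothetical way a brand or try-on provider could handle scan data: stored under a random, non-identifying token, retained only for a short window, and then deleted. The function names and the retention period are illustrative assumptions, not drawn from any specific vendor’s implementation.

```python
# Illustrative sketch only: anonymised, short-lived storage of try-on scan data.
import secrets
import time

RETENTION_SECONDS = 15 * 60  # hypothetical 15-minute retention window
_scan_store: dict[str, tuple[float, bytes]] = {}  # token -> (expiry time, scan bytes)

def store_scan(scan_data: bytes) -> str:
    """Store a scan under a random token that carries no link to the customer's identity."""
    token = secrets.token_urlsafe(16)
    _scan_store[token] = (time.time() + RETENTION_SECONDS, scan_data)
    return token

def purge_expired() -> None:
    """Irretrievably delete any scan whose retention window has passed."""
    now = time.time()
    for token in [t for t, (expiry, _) in _scan_store.items() if expiry <= now]:
        del _scan_store[token]
```

In practice this logic would sit server-side, with audited deletion and contractual backing, but the shape is the point: no durable identity attached to the scan, and a hard expiry rather than indefinite retention.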
Caution over curiosity: the perils of treating technology as a toy
French fashion house Coperni has made headlines this week following its Paris Fashion Week show, for the second year in a row. Following the spray-on dress stunt in September last year, the brand stepped up its intertwining of technology and fashion and brought Boston Dynamics’ first commercially available robot, ‘Spot’, onto the runway, where it interacted with models.
This was unquestionably a quote-unquote moment – something Coperni specialises in creating. Putting cutting-edge robots and high fashion side-by-side seems, on the surface, like the perfect way to create a spectacle. But while the use of technology as part of traditional runway shows is being embraced as a glossy novelty, this story (and the ones before it) is also emblematic of fashion’s tendency to wade into areas that should perhaps be approached with more caution and consideration.
In this case, flirting with robotics is provocative due to the open question of automation replacing labour (a question the brand may be deliberately invoking here), but there’s also a more direct implication, since Boston Dynamics have courted their own cultural controversy by testing the same model seen on the runway with Massachusetts law enforcement in 2019.
Having a robot play a pivotal role in a fashion show is fun, obviously, but robotics has a darker side – one that key players in the robot industry are working to distance themselves from in a pledge not to “weaponise” their platforms.
It may be tempting to look at concerns like these as sapping the fun out of fashion, but The Interline’s perspective is that fashion – at a whole-industry level – needs to ensure that it takes technology as seriously as it deserves to be taken. Around the same time Spot walked the runway, Los Angeles City Council was voting on (and eventually delaying) its police department’s pitch to procure its own robotic “dog” – something that detractors see as a symptom of deepening divisions between law enforcement and communities.
This should stand as a reminder that much of what the fashion industry is experimenting with right now is also inextricably tied to global questions about cutting-edge issues like AI, robotics, and biometric data. There’s no doubt that tech is essential to a lot of what fashion wants to do, but it can also come with some heavy baggage.
Investment news: science as well as software
March has seen a significant amount of funding invested in fashion technology companies that have a focus on science, rather than just software. First, a slew of big brands have signed onto California start-up Rubi’s pilot project to decarbonise fashion through a pioneering, carbon-negative approach to creating cellulosic textiles – representing an additional $8.7 million in funding.
This is a strategic investment at the industry level in a very different, scientific approach to decarbonisation, tackling the issue at the point of manufacturing. At a simplified level, Rubi diverts CO2 emissions that would otherwise end up in the atmosphere, converting them into pure cellulose pulp using enzymatic reactions. This pulp can then be treated as a raw material in a similar way to other natural and synthetic inputs, and turned into fibres, yarn, and textiles. The process is held out as being net carbon-negative, water and land neutral, and fully traceable.
Another science-backed fashion technology company is US-based textile recycler Circ, which closed a $25 million funding round earlier this week. The Circ solution comes later in the product lifecycle, disassembling finished products (with a broad range of starting materials such as cotton, polyester, and polycotton) into their components using hydrothermal processing.
As both are still startups, it may be some time until results from companies like Rubi and Circ are seen, and even if they succeed at scale, their contribution will be just a drop in the ocean relative to the scale of the issues that plague the global fashion industry. But it’s encouraging to see investors recognising that solving fashion’s sustainability crisis will require R&D funding at a fundamental, scientific level.
The best from The Interline:
This week we published three exclusive features, written by the Director of Digital Product Creation for a major multinational brand, Katherine Absher of Cotton Incorporated, and Tom Cowland of Foundry.
Describing fashion as “the zeitgeist of our times”, Tracey Mancenido, Director of Digital Product Creation for a major multinational brand, shares her perspective on why our industry should be rethinking the way it approaches DPC.
Katherine Absher, digital product creation specialist for Cotton Incorporated, makes the case for combining the benefits of digital materials with the confidence that those fabrics can be physically produced.
In his exclusive feature, Tom Cowland imagines a future where scalable DPC workflows integrate seamlessly with upstream and downstream processes.