Every week, The Interline analyses one or more vital talking points from across the landscape of fashion technology news. This analysis is also delivered to Interline Insiders by email.

Heralding serious shifts in logistics and interactive consumer experiences, robotics and real-time rendering are quietly reaching major milestones.

Ask almost anyone what technologies are going to define the future, and they’ll answer with some combination of automation and virtual worlds. Those two answers have, in fact, been the go-to shorthands for futurism for decades; ever since the first industrial robots set up shop on the assembly line, and the first videogames appeared on arcade and home displays, people have wrung their hands over automation replacing human labour on a grand scale, and over the spectre of a generation disappearing into a digital realm.

Those fears have always been premature. While robots have certainly replaced a lot of repetitive manual assembly tasks in many different industries, much skilled work is still undertaken by humans. And while videogames are inarguably the biggest entertainment medium there is, so far a cohesive metaverse that threatens to make people forget the real world in huge numbers has yet to manifest.

But behind the extreme interpretations of the possibilities and the potential drawbacks of these two forces, a huge amount of progress is being made. And occasionally, snapshots of that progress surface in a way that makes those quiet advances visible to people who don’t work directly in those sectors.


As it happens, two such instances took place this week – showcases of technological progress that gave the team at The Interline pause, and that reveal just how far these two very different areas of endeavour have come in what feels like an incredibly short span of time.

As media channels continue to cross-pollinate and merge, it’s no secret that every industry wants to find a way into videogames. From brand collaborations with popular multiplayer titles and their streamers and audiences, to physical collections bearing gaming icons, fashion is taking gaming more seriously than ever. But videogames really are something of a Trojan horse; while fashion concentrates on making the most of its interactions with pre-existing franchises, the latest iterations of the real-time rendering engines on which those franchises are built are being used to transformative effect in other industries.

Take the video below, for example. Built on the latest version of Unreal Engine (UE5, now available in early access) by a single person, this showcase demonstrates the significant strides that have been made in the use of real-time engines for architectural visualisation – a discipline that has more in common with fashion than some people care to admit.

The most obvious yardstick to judge this video by is its photorealism. It certainly isn’t perfect (people with experience of real-time environment design will notice cube maps where ray-traced reflections should be), but to a casual observer the level of fidelity achieved here is close enough to reality to pass scrutiny. And it’s important to note that everything shown off in the video is being rendered entirely in real-time, on consumer-grade hardware. It may be quite expensive consumer-grade hardware, but the fact remains that rendering believable environments, with global illumination and physically based materials (the two primary contributors to how real this showcase looks), is now eminently achievable. And as The Interline has written before, reliance on local compute and graphics horsepower is likely to be reduced as both gaming and other real-time applications make deeper use of cloud streaming.

The second metric by which this video should be judged is its interactivity. Unlike traditional 3D CAD fly-throughs – which were how architecture and interior design were once visualised – and offline (static) renders, several sets of materials in the scene the environment artist (dviz) has created can have their properties manipulated in real-time. In this case that capability is demonstrated by the user changing the colours of sofas and kitchen units to their exact specifications, but the same principles apply to substituting one floor tile for another, or swapping out furniture entirely.


And when these capabilities are combined with the democratisation of photogrammetry and object capture we have previously covered, it’s easy to conceive of an architect or interior designer offering their clients a near-photoreal playground to experience a potential project – one that includes configuration from a pre-digitised catalogue of options, as well as more granular customisation of colours and materials. If we consider also integrating these consumer-facing real-time experiences with wider material and asset libraries, then the possibility space opens up even further.

The applications of this same category of real-time visualisation and interactivity for fashion should be clear. Brands are already building out virtual spaces for shoppers to explore, and digitising their wholesale catalogues, but the level of fidelity and interactivity of these spaces could now be improved several-fold with the right skillsets and the right applications of advances in real-time rendering.

From a commercial point of view, fashion’s current fixation on gaming makes a great deal of sense. But from a longer-term creative perspective, and from the angle of wanting to deliver transformative experiences to in-house, partner, and consumer audiences, the potential of real-time rendering is largely going untapped. And as this application for architecture demonstrates, that potential has come a long way very quickly.

And the same can be said for the impact that robotics, combined with machine learning, is having on warehousing and logistics. As the video below (from popular YouTube science communicator Tom Scott) shows, much of what we mentally file away as manual work – picking and packing groceries – is already being automated.

This video is especially impressive because, like a lot of people, The Interline experiences only the last mile of the grocery delivery network. Food arrives at our door in baskets carried by a human being, so the natural assumption is that the majority of the previous miles of that journey are similarly labour-intensive. In fact, as this video reveals, automation is now being deployed to such an extent that entire distribution hubs are being built without human workers in mind at all.

And as above, the implications for fashion are obvious: as shopping channels continue to blur, and as the requirement to bring product to the consumer rapidly intensifies, the ability to predict what a shopper (or a cohort of shoppers) is going to order, and to apply automated robots to the task of picking that order and assigning it to the right retail network, could go even further to change the way we think about distribution.

While these two examples may seem disconnected, they represent a very real and universal trend: the capacity of transformative technologies to make significant advances out of sight and out of mind, so that when glimpses of that progress reach a broader audience, they feel almost impossibly far ahead of our mental benchmarks.

And if fashion is to learn anything from the pandemic-catalysed rush to digitisation, it’s that technology (and the other industries that use it) is often much further ahead than we might expect.