Using Virtual Reality To Wedge Open The Shrinking Window For Creative Expression In 3D

Key Takeaways:

  • Digital transformation has expanded the development pipeline but paradoxically compressed the creative one, forcing engineering decisions into the earliest phases of ideation. VR offers a necessary counter-movement, restoring the “sculptural” intuition of design by allowing creatives to resolve volume and proportion in space before the rigours of manufacturing logic take over.
  • Traditional workflows that translate 2D sketches into 3D models often degenerate into a cycle of fixing impossible geometry or cheating proportions. “Immersive sketching” creates spatially accurate assets from inception, shifting the downstream CAD workflow from one of repair to one of refinement—a change that can reduce sampling cycles by up to 66%.
  • The industry’s hesitation to adopt VR is less about hardware limitations and more about a lack of software interoperability. Because VR models currently lack the “garment logic” and physics data required by downstream platforms like CLO or Browzwear, scaling immersive design requires treating it not as a replacement for CAD, but as a distinct, upstream data source that feeds cleaner intent into the stack.

Fashion has spent the last decade perfecting its digital tools. But while that process has successfully expanded the canvas for whole-lifecycle design and development, the roll-out has also, somewhat paradoxically, narrowed the spaces where creativity begins.

There’s no denying that platforms like CLO and Browzwear have transformed accuracy, efficiency and sustainability across the product lifecycle. These platforms, and others, are now the essential foundations of digital product creation as a complete ecosystem. Yet the more precise these systems have become, the earlier designers are being asked to commit to construction logic, seam placement and proportion. As digital transformation has rolled onwards, the already-small window for free, creative expression has started to shrink even further. Technical decisions now need to be made at a point in the process that once belonged to unfettered exploration, and the instinctive, sculptural part of design has become harder to access within screen-based workflows – in part because digital product creation has brought the rigours of engineering forwards into the space that used to be reserved for creativity.

Is there a way to claim some of that back, whilst still holding onto the benefits of DPC? I think so: and I think the answer could be virtual reality.

This raises an exciting prospect: if VR can revolutionise vehicle prototyping and performance footwear, why not garment design? As fashion and gaming converge through shared assets and real-time 3D environments, immersive creation may soon become a standard part of digital design, opening up new horizons for the industry.

The Persistence of Two-Dimensional Logic

During the pandemic, virtual reality had a brief “moment”, with immersive catwalks and interactive showrooms generating headlines as substitutes for things we used to do in the world. When we all emerged from lockdown, though, VR wound up being sidelined as a temporary fad, or an optional storytelling tool – at least in fashion. Very few apparel brands carried on using VR as anything more than a passive viewing system.

By contrast, industries like automotive, architecture and performance footwear continued to evolve VR as a valuable design tool, using it for spatial testing of early decisions, before precision was required. Designers in those fields carried on working in VR as a way to experiment with form, primitives and ideas that could then become part of the digital lifecycle of their products.

Why did this happen? Why did other industries successfully take on immersive creation as a cornerstone of digital design? And why did fashion turn away from it?

Despite the realistic garments that 3D software can create, the answer, I believe, is that underlying fashion workflows were too rooted in 2D. Even when people work in 3D for patternmaking, fitting, and garment engineering, designs typically start as flat patterns wrapped onto a digital avatar, and get adjusted and re-simulated until they behave correctly on screen. Even with on-avatar sketching, designers often find themselves estimating depth, scale and balance (the parts of 3D that have no exact analogue in flat-plane design) rather than experiencing them. And while digital blocks provide consistency, any inaccuracy in a base block can quietly carry through an entire collection, so the consequences of getting 3D wrong from first principles cascade out across the extended product lifecycle.

The challenges also go beyond just the blocks. CAD viewports often create unrealistic simulations: fabric appears to stretch far beyond what would happen in reality, proportions seem stable until they are sampled, and minor distortions through drafting typically reveal themselves only later in the process. When the foundational block is either missing or inaccurate, these issues quickly accumulate.

The 2D design cycle trap became particularly evident during a project I worked on with SCIMM, Decathlon, and the Antwerp Giants basketball team back in April of 2025. (For more on Decathlon’s work in 3D and DPC in general, refer to the DPC Report 2024 – Editor) To give you some context, SCIMM had previously delivered custom-fitted uniforms to players through a body-scanning pilot project aimed at creating better-fitting apparel. They intended to market this initiative, but faced a major setback: the original digital files were missing. This left them with tangible results but no means to showcase their creation beyond a few photographs of the physical garments, a true marketing nightmare.

I was brought on board to help recreate the pieces using VR technology. If we had tried to recreate those pieces through a traditional CAD workflow (without the original blocks), we would have encountered all the limitations of a 2D-centric process: extensive guesswork, repeated simulations and a significant risk of deviating from the original fit. SCIMM’s goal was for the digital uniforms to match the custom fit as closely as possible, which meant we needed to rebuild the garment logic from the ground up. Fortunately, VR made this possible primarily because it represents reality in a way that flatscreen 3D can’t. By working with the body-scanned avatars of the athletes, I could construct new 3D blocks directly on the avatars, enabling a far more accurate reconstruction of fit, proportions and panel logic than would have been possible with traditional screens alone. Once the spatial structure was established, the garments were moved into CLO for refinement. What would have typically taken weeks of trial and error was completed in just a matter of days.

It’s important to note that VR doesn’t replace pattern logic, nor does it eliminate the need for technical refinement, nor remove any of the skills that 3D designers and developers have honed. What it does is restore the spatial understanding that screen-based tools can obscure. It brings creative intent and proportion back into CAD – something that I think has the power to strengthen the entire workflow from the outset.

So, if VR can restore spatial understanding before a garment reaches what we think of as the “CAD stage” today, the next question is: how should ideas begin? Sketching has always been fashion’s most instinctive stage: fast, expressive and unburdened by technical constraints. Yet in digital workflows, it often becomes a point of friction, because it represents one of the most obvious disconnects between what logic dictates should be the “right way” to design fashion, as a physical thing occupying space, and the reality of creative workflows and tools. An idea is drawn in 2D, tested in 3D, sampled and then refined again in CAD. By the time the design reaches maturity, much of its original energy has already been reshaped by a process of iteration, translation, and re-translation. We draw in 2D with a clear intent in mind, only for that intent to be slowly squeezed out of the process because it only existed in 2D, and the rest of the lifecycle demands 3D.

Immersive sketching, of the kind I’ve been advocating for people to practice in VR, bridges the gap between design and execution by allowing designers to create at full scale around an avatar, as seen in the Decathlon work. This method provides immediate volume and intent, making the process feel more like sculpting and less like drawing on paper that someone else is then going to try to reinterpret into a 3D model. In VR, it’s common to quickly produce numerous silhouettes without the constraints of traditional tools, allowing you to sketch well over 100 different designs within just a few hours – designs that are all 3D by their very nature, even if they require additional work in dedicated 3D design and simulation tools before they become part of the DPC ecosystem.

Most importantly, fit is not sacrificed in the process. VR sketching works with 3D pattern blocks (skeletons) that surround the avatar and maintain correct proportions from the outset. Just like 2D blocks, these 3D blocks provide structure to creative freedom, ensuring that silhouettes developed in VR do not unravel when they are moved into CAD. Instead of guessing fit from a flat sketch, designers begin with a spatially accurate foundation that remains reliable throughout the workflow.
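
For readers who want a more concrete picture of what a 3D block is, the sketch below shows one way a proportion-true shell could be derived from a body-scanned avatar by offsetting the scan outward along its surface normals. It is a deliberately simplified illustration using the open-source trimesh library – not how any particular VR sketching tool actually builds its blocks – and the file names and ease value are placeholders.

    # Illustrative only: derive a rough "3D block" shell by pushing a scanned
    # avatar's surface outward along its vertex normals. Real VR sketching tools
    # build blocks with far more garment awareness; this only shows the concept.
    # Requires: pip install trimesh
    import trimesh

    EASE = 4.0  # hypothetical ease allowance between body and block (assumes the scan is in centimetres)

    # Load a body-scanned avatar (the file name is a placeholder)
    avatar = trimesh.load("avatar_scan.obj", force="mesh")

    # Offset every vertex along its normal to create a loose shell around the body
    shell_vertices = avatar.vertices + avatar.vertex_normals * EASE
    block = trimesh.Trimesh(vertices=shell_vertices, faces=avatar.faces, process=False)

    # Export the shell so it can act as a proportion-true sketching guide
    block.export("torso_block.obj")
    print(f"Block shell saved with {len(block.vertices)} vertices")

The detail of the code matters less than the principle: the block inherits its proportions from the scan rather than from estimation, which is exactly what keeps VR silhouettes from unravelling downstream.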

The layered dress I developed specifically for this report is a clear example. It started as loose sketches around an avatar, where silhouette and colour were explored; from there, the form was modelled in layers using the 3D block as a template, accessories were created, and only then was the piece taken into CLO for technical refinement. The entire process, from first sketch to a complete digital garment, took four hours. And because the proportions were established in space rather than on paper, the CAD stage focused on refinement rather than repair. This approach doesn’t replace technical skills (mine or anyone else’s); it prioritises creative exploration first, ensuring that ideas are infused with clarity and spatial logic as they move into the technical phase of their lifecycle – something that doesn’t happen when those ideas begin life in 2D.

Experimentation: Efficiency and Waste Reduction

One of the most compelling strengths of immersive design is the speed at which ideas can progress before they enter a simulation environment. CAD tools excel at technical accuracy, but they also depend on detailed materials, shaders and what I often call ‘digital stickers’ to communicate colour and surface detail. Any adjustment – a new print, an altered hue, a shifted placement – requires reassigning assets and re-rendering to get accurate visuals. It’s ideal for refinement and development, but naturally slower for early-stage exploration.

VR, on the other hand, allows ideas to evolve quickly, potentially saving valuable time, without sacrificing the “3D” element and while retaining the ability to instantly test colour, print, placement and silhouette. Reference images, AI art, and textures can be integrated directly into garments, facilitating fluid iteration and experimentation without losing creative momentum.

The layered dress I developed for the report highlights this process. After establishing the silhouette in VR, I quickly explored various print and trim options, including AI-generated motifs, ultimately choosing custom-designed butterflies as accessories. These were exported to Blender, where material colours were applied, and then easily integrated into CLO. This speed does more than shorten approval times: by resolving silhouette, surface and colour choices in VR before going to CAD, you can reduce sampling by up to 66% and shrink a pipeline from three weeks to just three days, all thanks to the clarity gained during the exploratory phase.

Collaboration and Real-Time Co-Creation

Traditional digital product development still relies heavily on back-and-forth communication. Files are exported, reviewed and returned with notes; screenshots move through inboxes; and time zones can stretch simple decisions across days. The process works, but it rarely feels very fluid. More importantly, it doesn’t always feel collaborative, at least not in a creative sense, because what people are looking at is not the working file, but rather a 2D export of it.

Immersive environments offer a different slant on collaboration, by enabling teams to work together in a shared virtual space, making real-time design modifications, on the same object, at the same time. For example, a designer in Paris and a developer in Seoul can collaborate on a digital outfit side by side, while external stakeholders join the conversation via Teams or Zoom without needing a headset.

In my partnership with Copper Candle to create Fortnite outfits for artists like Noelle and Michael Aldag, we aimed to showcase how VR can facilitate more intuitive design, giving artists a clearer voice in design decisions, enabling remote working and speeding up the design process. Even though the artists didn’t have formal fashion training, they instinctively guided the process, resulting in quick adjustments to fit and colour. This approach made their stage outfits feel more personal and a true reflection of their identities. Likewise, during my jewellery collaboration with NYC artist Jonathan Cuji, we communicated, brainstormed, built and designed Amazon Rainforest-inspired jewellery in real time in VR, even with a six-hour time zone difference between us.

Why Virtual Reality Has Not Yet Scaled

So, if VR can deliver this level of speed and spatial clarity, and if specific brands have successfully incorporated it into their workflows, why isn’t it already a standard part of the design pipeline?

In my experience, the hesitation is less about the technology itself and more about the systems that surround it. Fashion’s digital tools operate within closed or semi-closed ecosystems, and fashion 3D CAD platforms rely on proprietary files, simulation engines and construction-aware metadata. When VR models are brought into this environment, they arrive without embedded garment logic, meaning it’s more complex to get fabric or texture simulation working the way people expect 3D to work out of the box today. In practice, this creates extra work before a design that started its life in VR can actually be used in the same way that a current 3D asset can.
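
To make “treating VR as an upstream data source” a little more tangible, here is a minimal, hypothetical sketch of what could travel alongside a raw VR export: a small sidecar file recording the design intent a technician would otherwise have to reverse-engineer. The field names are invented for illustration, and no real CLO or Browzwear import format is implied – the point is simply that intent can be captured as data at the moment it is clearest.

    # Hypothetical sketch: pair a raw VR mesh export with a small sidecar file
    # describing design intent, so downstream CAD work starts from recorded
    # decisions rather than guesswork. Field names are invented, and no real
    # CLO or Browzwear import format is implied.
    import json
    from pathlib import Path

    def write_design_intent(mesh_path: str, intent: dict) -> Path:
        """Write a JSON sidecar next to the exported VR mesh."""
        sidecar = Path(mesh_path).with_suffix(".json")
        sidecar.write_text(json.dumps(intent, indent=2))
        return sidecar

    # Example intent captured at the end of a VR sketching session
    intent = {
        "avatar_reference": "athlete_scan_07",      # which scanned body the block was built on
        "panels": ["front_bodice", "back_bodice", "sleeve_left", "sleeve_right"],
        "target_fabric": "single jersey, 180 gsm",  # a note for the technician, not simulated physics
        "ease_notes": "loose through the chest, fitted at the hem",
    }

    path = write_design_intent("layered_dress_vr_export.obj", intent)
    print(f"Design intent recorded at {path}")

A lightweight sidecar like this asks nothing of the VR tool beyond honesty about what was decided, and nothing of the CAD tool beyond reading a text file – which is roughly the level at which interoperability has to start.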

Material behaviour adds its own complexity. CAD tools simulate cloth physics with impressive precision; VR tools, built primarily for modelling and ideation, focus on form rather than gravity, meaning there is no fabric simulation at all. For technical designers this can feel limiting, while for creatives it can feel liberating. Either way, it reinforces VR’s role upstream, at the point where silhouette, proportion and direction are set, and before technical accuracy takes over. In practice, the handoff between VR design and established 3D work is more streamlined than a purely 2D-to-3D workflow, but more demanding in one respect: it exposes the disconnect between the fragmented DPC ecosystem and the VR tools that exist today.
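
For anyone curious what “cloth physics” actually involves under the hood, the toy sketch below drops a small grid of points under gravity and repeatedly enforces distance constraints between neighbours – the crudest possible cousin of what dedicated solvers do. It bears no resemblance to the engines inside CLO or Browzwear; it is only here to show why drape is a heavy, downstream concern that VR ideation tools are sensible to leave alone.

    # Toy illustration of "cloth physics": a grid of points under gravity with
    # simple distance constraints (position-based dynamics). This is nothing
    # like the solvers inside real garment-simulation software; it only shows
    # why drape is a separate, downstream concern from sketching form in space.
    import numpy as np

    W, H = 12, 12                      # grid resolution
    REST = 0.05                        # rest length between neighbouring points (metres)
    GRAVITY = np.array([0.0, -9.81, 0.0])
    DT = 1.0 / 60.0

    # A flat vertical sheet; the top row is pinned, like a swatch hanging from a rail
    pos = np.array([[x * REST, -y * REST, 0.0] for y in range(H) for x in range(W)])
    prev = pos.copy()
    pinned = set(range(W))             # indices of the top row

    # "Seams" between horizontal and vertical neighbours
    edges = [(y * W + x, y * W + x + 1) for y in range(H) for x in range(W - 1)]
    edges += [(y * W + x, (y + 1) * W + x) for y in range(H - 1) for x in range(W)]

    for step in range(240):            # roughly four seconds of simulated time
        # Verlet integration: step positions forward under gravity
        velocity = pos - prev
        prev = pos.copy()
        pos = pos + velocity + GRAVITY * DT * DT
        for i in pinned:
            pos[i] = prev[i]           # pinned points stay put

        # Relax constraints so neighbouring points stay roughly REST apart
        for _ in range(10):
            for a, b in edges:
                delta = pos[b] - pos[a]
                dist = np.linalg.norm(delta) + 1e-9
                correction = 0.5 * (dist - REST) / dist * delta
                if a not in pinned:
                    pos[a] += correction
                if b not in pinned:
                    pos[b] -= correction

    print("Lowest point after settling:", round(float(pos[:, 1].min()), 3))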

Education and perception also shape adoption. Most designers entering the industry are trained through pattern-first, screen-based workflows and have limited exposure to immersive design. As a result, much of the industry assumes VR is unready, or suited only to experiencing content, even though those who try it are often surprised by how intuitive and natural it feels to actually work on forms and ideas in virtual space.

All in all, younger designers tend to embrace VR more readily, while teams working in established workflows, shaped by legacy tools and timelines, approach it more cautiously.

Which brings us to a conclusion: the limitations holding VR adoption back in fashion are more systemic than technological. As interoperability improves, garment-aware modelling evolves, and immersive design becomes more visible in education, VR will, I expect, move naturally from the margins of digital product creation into the upstream phases where it can have the greatest impact.

I’m obviously speaking as an advocate here, but I firmly believe that what VR offers is not a replacement for CAD but a complementary stage, one that restores spatial intuition to a workflow that has become increasingly screen-bound. It should empower designers to resolve silhouette, proportion and creative intent before technical decisions are required, reducing rework, strengthening the accuracy of the tools that follow, and also answering the well-worn question of what it means to “scale DPC” without mandating that everyone who touches a 3D asset must also be a 3D designer. 

VR can also connect teams in real time, encourage participation from non-technical collaborators, and keep creativity active throughout the process rather than compressing it into early sketches. The value here lies in a new kind of balance between creating digitally and building digital assets that can be the foundations for more end-to-end workflows – a blending where intuition and computation can coexist, where creativity and accuracy reinforce one another, and where the human element of design – spatial thinking, collaboration, instinct – has space to thrive alongside technological precision, instead of working against it.
