To better understand what might be coming next for behind-the-scenes technology in beauty, and how effectively the philosophies and principles of DPC can transfer from one sector to another, we commissioned makeup artist Jack Oliver, who also sits on the British Beauty Council, to put their physical skills into practice for our DPC Report beauty cover, using new DPC tools dedicated to the beauty sector.
Below, we quiz Jack on their inspirations, their workflows, and their pipeline to provide behind-the-scenes insights into one of our two cover designs this year. We also spoke to Dr. Alex Box, another member of the British Beauty Council and one of the founders of V-Metics, the virtual cosmetics platform Jack used, along with Tony Lacey from Epic Games, whose MetaHuman platform provided the unifying digital avatar layer across both our covers.
Candidly: The Interline doesn’t know if DPC for beauty has the same long-term potential that it’s demonstrated for fashion. As you’ll learn from Jack’s answers, digital tools clearly provide a new outlet and an uplift for seasoned and emerging makeup artists, and the conditions for major cosmetic brands to achieve similar results to the ones we’ve seen in fashion are present, but there are no guarantees.
We do know, though, that beauty technology is on a marked upward trajectory in both B2C and B2B use cases, and on that basis we believe that DPC deserves the same opportunity for discussion, exposure, and development in beauty that it’s had in fashion.
The Interline: Jack, walk us through your career as a makeup artist. Where did you start and what do you see as being the milestones that have defined your skill progression and your personal style?
Jack Oliver: From a young age I was introduced to the world of difference-making and expressiveness in self-image. I first really took notice of makeup being a creative space thanks to the people I grew up with from the alternative community, who would wear expressive makeup and make dramatic changes to their appearances to reflect who they were.
This opened up a world for me as a young artist, one where I sought a new way to express my creativity and my personality. In terms of media, I played with art, photography, and graphic design during my education, but I then moved into higher education wanting to learn makeup specifically.
At that point, I started working in the retail side of makeup, and I had the opportunity to hone my craft as a makeup artist. To me, that meant learning about every face that I had the chance to use as a canvas, and developing my work to the point where I was able to achieve what I wanted physically, which then led me into wanting to see how to work with the same aims and the same materials digitally, or phygitally, where the digital and the physical makeup reflect one another exactly.
The Interline: What was your first introduction to working on makeup digitally? And how did the tools available to you then differ from where they are now, maturity and capability-wise?
Jack Oliver: V-Metics, which is the platform I used to create this makeup, was my first experience of working digitally in the sense of the canvas being a 3D face and a digital twin of the materials. Prior to this, I’d learned Photoshop as part of a photography qualification, and I’d had an experience I suspect a lot of people reading this report have had: designing an avatar in a videogame character creator, which in my case was The Sims.
It was actually The Sims that brought me here. Not too long ago, I was taking some downtime between projects, and I found myself missing the playfulness of makeup, so I’d started creating makeup selections on Sims (the generic name for the characters from the game) and posting them on social media as a bit of an experiment. That led to Dr. Alex Box, one of the founders of V-Metics, reaching out to me and inviting me to go hands-on with a closed beta.
That gave me the chance to take some of the same playfulness I’d found in videogames, and to pair it with the real craft, precision, and artistry of physical makeup – using the tools I was familiar with in the real world to create digital looks. From there, I’ve worked on learning the tools, developing my skills, and also being an advocate for the way that digital product creation in beauty and cosmetics can help both preserve and protect the artists’ touch, and also give those same artists a new way to connect physical and digital spaces, to learn, and to experiment.
The Interline: Needless to say, given that this is the fourth digital product creation report, fashion is now at the point where brands, suppliers, and consumers all regularly interact with digital representations of materials and products. Those different touchpoints may not always share a common foundation, which is one of the challenges of pursuing end-to-end 3D in apparel, but in discrete areas clothing has come a long way towards having 3D be the reference frame for a lot of creative work and commercial decision-making. Dr. Alex, why do you think that’s not been the case for beauty?
Dr. Alex Box: They might exist side-by-side in retail, but fashion and beauty are very different industries in terms of what they are actually selling. The way I look at it, fashion is sculpture while beauty is painting – and fashion creates products, while beauty creates promises.
Any digital tools that want to be adopted in beauty need to recognise that distinction, and frankly the reason that the beauty industry hasn’t adopted a 3D framework is because the software hasn’t been able to measure up to representing real products and materials, or to the high bar needed to allow artists and product developers to create, pre-visualise, and refine looks and formulations, virtually.
In fashion, 3D pattern and form-making software hinges on a lot of the same principles that 3D CAD does in rigid, hard-body automotive and engineering design. Size, soft materials, and fit are really the only variables that change in that equation.
Beauty is a different proposition in the physical world, so it needs to be treated as a different proposition in the digital one, too. While fashion can obviously be styled and worn in artistic and unique ways, formula-based beauty products are only really realised when they undergo a process of highly nuanced, individual interpretation and application in the hands of the artist and the consumer.
That latter part really is key, too: the beauty industry has focused heavily on B2C technology, because the final expression of a beauty product happens in studios and in customers’ homes. Now, I believe the industry is waking up to the B2B applications as well, for all the reasons Jack has already mentioned.
The Interline: What was your inspiration and your intent behind the makeup you created for this first-ever beauty cover for a DPC Report? What were you aiming to express, and how did you go about it?
Jack Oliver: I really wanted to explore colour and texture. For the MetaHuman model I picked, Kioko, the goal was to appreciate and showcase the skin, with all its tone and textures. I picked the colours – muted mustards and graphic golds – to complement the model, and I worked on shapes and placements that would bring light to where it was needed, and would also make a striking statement that captured a feeling of happiness and reflection.
“It was important to me to make sure that the fine details of Kioko’s skin were visible, because I think we yearn for authenticity and imperfections in digital representations more than we do in the real world.”
JACK OLIVER
The key thing, for me, was to really test how effectively I could use digital tools to reflect the entire makeup journey of capturing a moment in time, and then bringing in elements and materials that can really evolve what they express through changes in lighting.
The Interline: What materials did you work with to create the final look? And how confident are you that you could replicate this digital creation with the same physical palette?
Jack Oliver: I tried to select materials that pushed boundaries, and that felt exciting and experimental – both metallics and mattes. For the eyes, I started by crafting a soft base with the perfect shades and tones, and then overlaid geometric lines using different digital brush tools and palettes. After this, I moved on to the skin and lips, where I wanted to create a very soft, demi-matte finish that would contrast with the highlighted gold in the eyes.
It was also important to me to make sure that the fine details of Kioko’s skin were visible, because I think we yearn for authenticity and imperfections in digital representations more than we do in the real world. In the physical world we pluck brows and remove baby hairs, but working digitally it felt right to highlight these instead, and to adapt the makeup and create around them.
Although I’ve chosen to spotlight some different elements digitally than I would with a physical makeup, I think it’d be exciting to recreate this look physically. It would involve the same materials I’ve worked with virtually: laying down matte mustard tones in a dry-down cream, blending the edges with soft tonal shadows, and then using a liquid gold graphic to create the exaggerated liner. I’d also still want to show real skin through the makeup, so I’d work with demi-matte finishes and soft matte lipliner to create a full lip look that’s then blended out into the skin.
The Interline: Tell us what your technical pipeline for this project looked like, from initial idea through to finished renders.
Jack Oliver: As I mentioned, after working on a brief together with the team at The Interline (this being the first ever beauty cover for one of these reports!), I started work in V-Metics. At that early stage, when you’re developing ideas, there’s a lot of value in being able to move back and forth through timeframes, and to easily start over – all without the sunk time and the waste of trialling makeup physically, or even on a paper face chart.
While I was experimenting with the makeup, I was also able to change models until I landed on Kioko, and to quickly develop a new direction that highlighted her skin tone, eye shape, and colour.
The final renders were run within V-Metics.
The Interline: Walk us through your physical setup as well. What local hardware were you running? What input device(s) did you use?
Jack Oliver: I currently use V-Metics as a Pixel Streaming application, so I’m able to create wherever there’s an internet connection, but without needing a lot of local processing power.
I have a graphics tablet and pen that I’m able to use the same way I would use a real makeup brush, in real-time. So provided I carry that and my MacBook with me, I’m able to keep working on projects without being tethered to a studio. I even put some finishing touches on this look while I was visiting family around the holidays.
This is an area where digital has a clear edge over physical. As well as not needing to be in the same physical space as a model, I also don’t need to open up a makeup kit, or make sure that I have the right physical materials to hand.
It’s important to me that working digitally truly reflects my movements, my intent, and my capabilities. But it’s also incredibly freeing to be able to pair that accuracy and craft with the flexibility to work almost anywhere, with no overheads besides the hardware that I already owned.
The Interline: As well as representing the cosmetic materials, any digital platform or pipeline that’s going to meet the high bar for actually being a viable place to test, develop, and refine not just individual looks but products and collections needs to incorporate a robust visual representation and physical simulation of the skin. Are we at the stage where you think this is ready? As an artist, are you confident working digitally?
Jack Oliver: As an artist who’s been doing makeup physically for many years on a freelance basis, I do feel as though working digitally is something I could make a part of my everyday creating. Despite being grounded in realistic skin and materials, the digital canvas feels more like experimentation, or gameplay – a platform I can approach with a clear mind, and then interact with, applying products virtually, and generating new ideas.
On that basis, having access to digital creation tools for beauty has already helped to level up my design and application skills, and I can see plenty of potential in the future where makeup artists like me turn to digital as their first port of call, and as a way to develop their art and their careers.
“Having access to digital creation tools for beauty has already helped to level up my design and application skills, and I can see plenty of potential in the future where makeup artists like me turn to digital as their first port of call.”
JACK OLIVER
I mentioned the experiments I did designing characters in The Sims earlier, but I want to emphasise that, while digital creation in makeup has the feel of “gameplay” to it (and while I’m sure games will be where some of the younger generation finds their own passion for makeup) I see it as a serious tool that can play a role in makeup artists’ skill development, and that can help preserve the craft, as well as giving makeup designers new opportunities to practice it.
The Interline: Tony, when it comes to deploying digital humans, fidelity has always been a primary barrier. Whether we’re talking about consumer-facing scenarios, or we’re aiming to put virtual faces or bodies in the hands of experienced artists and engineers, the limiting factor is how effectively digital can represent physical – both as a self-contained representation, and as a canvas for creative and technical work.
We’ve seen this tension lead to innovation in soft body avatars, where fit and athletic performance in garments are concerned, but the bar is arguably set even higher for cosmetics and beauty, which demand both an extremely granular understanding of the human face, and a reliably accurate surface that makeup artists can apply complex materials to, with confidence that the results will mirror the real world.
How far have we come towards that vision, with MetaHuman, and what, technically speaking, is going on behind the simultaneous pursuit of photoreal final pixels and uncompromising fidelity in skin and materials?
Tony Lacey: MetaHuman has already brought us much closer to faithfully representing real humans through final pixels that deliver richly expressive and emotionally credible characters. That comes from grounding the entire system (geometry, materials, animation, shading, etc.) in high-fidelity scan data and physically based rendering. The engine isn’t just drawing a face; it’s simulating how light interacts with layered skin, micro detail, and expressive movement, so that creators can treat the surface with the same confidence they would in the real world. This is why makeup, complexion work, and other beauty-specific materials now behave predictably across a wide range of faces.
But our ambition for MetaHuman goes much further than reproducing the physical world. Yes, one of our goals is to represent real people in a way that feels authentic and believable, but the broader vision is to let anyone create any character they can imagine. MetaHuman is evolving toward a system where stylisation, identity, and self-expression are all first-class citizens, not edge cases. Ultimately, we want everyone to represent themselves digitally in the way they feel inside, not just how they appear externally, and we want digital makeup and beauty workflows to come on that journey with us; supporting realism when needed, but equally enabling creative freedom, transformation, and self-authored digital identity.
The Interline: Dr. Alex, what does the onramp for new makeup artists, and the career development opportunities for existing ones, look like as we go into 2026? What entry points are there to learn the craft, either through entry-level opportunities, or from existing subject matter experts and icons? In fashion, we’ve seen 3D make significant inroads into education at least partly because of industry demand, but it feels as though perhaps there’s a “chicken and egg” situation here, where digital product creation needs to take off in-industry for it to be taught at an institutional level, and vice versa.
Dr. Alex Box: This is an area where I agree that beauty has a lot to learn from fashion, as well as where fashion’s uptake of DPC tools is creating new career development opportunities for aspiring and professional makeup artists.
As we go into 2026, digital product creation pipelines for garments and footwear are getting more mature, and we’re starting to see much more in the way of real-time engines being adopted across design, production, and content creation pipelines. Now that those ecosystems are becoming better-connected, we’re also seeing major brands in luxury (LVMH for example) wanting to treat them as holistic, complete workflows that include apparel, styling, makeup and hair – all being represented digitally.
This gives makeup artists a new entry point to the future of digital product creation and content creation, as well as providing them with transferable skills to apply their craft in cross-disciplinary ways, across digital marketing, gaming, and film.
At the moment, there’s a broad selection of new elective courses covering digital fashion, as well as greater integration into mainstream curricula, and we clearly need to see equivalents in beauty.
Right now, that education and upskilling pipeline is held back by limitations in the hands-on tools and ecosystem for digital beauty. This, I believe, is going to change quickly as a result of demand, and as we can see from the way Jack has been able to translate real-world skill into digital platforms, and vice versa, there are makeup artists already hungry to find new ways to practice their craft that offer a non-destructive means of moving from digital to physical and back again.
I expect that we’re going to see digital product creation become a much larger fixture of beauty strategies in the very near future, so having that pool of hybrid talent (and the tooling to allow them to work digitally, natively) is going to become critical very soon.
