Key Takeaways:

  • AI-generated content is already moving through platform infrastructure without resistance. The Velvet Sundown, a fictional band created using AI, achieved hundreds of thousands of listens on Spotify, showing that synthetic media doesn’t need to be convincing; it just needs to trigger engagement.
  • Fashion might be approaching its Velvet Sundown moment. Amazon already replicates products based on review, search, and sales signals. As generative tools close the gap between prompt and product, fashion could easily adopt the same practice as streaming platforms – fast, “good enough” content that fills the feed, not because it’s better, but because it’s there.
  • Cloudflare’s move to charge AI bots for scraping is one of the few efforts to reintroduce friction. By charging bots for access, Cloudflare is building a transactional layer where none existed before, and its pilot programme offers a way to track and potentially recover lost value. This is a significant shift from protecting original work to building new economic structures around machine legibility. The fashion equivalent may be harder to realise, but the logic is there and the infrastructure is forming; whether the incentives follow is still uncertain.

Upcoming Webinar: A New Perspective with SWAROVSKI and Centric Software

On 11th July, The Interline will chair a live online event, featuring expert insights from SWAROVSKI and Centric Software – along with a live walkthrough of key use cases and examples where combining creative tools with live enterprise data can create a unifying source of visual truth that provides a whole new perspective on product development.

Join Alicia López (Global Process Manager & PLM Owner for SWAROVSKI), Dawid Oleskowski (Business Consultant for Centric Software) and The Interline’s own Ben Hanson to learn how a world-renowned brand like SWAROVSKI was able to streamline and transform its product development processes by intelligently aligning creativity with technology, and by uniting previously-disconnected data and workflows into a new framework of real-time decision-making.

AI music, compensation models, and what they mean for fashion

You don’t always see the moment something changes. Sometimes it just fades in, like a track you didn’t notice starting, but that’s already playing.

In many ways, The Velvet Sundown drifted into Spotify that way. Moodboard artwork, a glowing Billboard review (that didn’t exist), and most pertinently of all, a smooth, ’70s-inspired alt-rock sound that blends almost too well into the kind of background music best described as gloriously forgettable.

So who is The Velvet Sundown? Until a week ago, they were, at least by outward appearances, just a small psych-rock band who had garnered over 500,000 monthly Spotify listeners on the back of two full-length albums, both released in a single month. (Few artists are that prolific, of course, which was the first red flag.) As one X user pointed out, they accomplished all this despite seemingly having only existed for two weeks. If the breakneck speed of their album releases wasn’t suspicious enough, every image of the band across social media was clearly AI-generated.


Ever the opportunist sleuth, the internet at large began digging deeper: fingers were pointed, questions asked, and just this week it was revealed in an interview with Rolling Stone that The Velvet Sundown was indeed a fictional band – an “art hoax”, as pseudonymous band spokesperson Andrew Frelon put it: “It’s marketing. It’s trolling. People before, they didn’t care about what we did, and now suddenly, we’re talking to Rolling Stone, so it’s like, ‘Is that wrong?’”

It’s an interesting question. One loaded with philosophical dilemmas surrounding ownership, the act of creation, and whether the human hand is indeed losing its grip on creative industries. As Frelon went on to say: “Things that are fake have sometimes even more impact than things that are real”.

Which is all well and good for the purveyors of the fakes, but decidedly less so for the people who make up the market for original works.

Let’s start with what we know, and work towards its impact on industries at large. Whether an art hoax, a prank, or an AI test run, the fact remains that someone somewhere created songs using AI (the extent of AI usage in their creation isn’t yet clear, but it would be surprising if it was anything less than “total”, outside of prompting and lyrics), then uploaded them to Spotify. This is where it gets interesting. The songs weren’t just left out to float; someone fed them into the system’s bloodstream, with some algorithmic or human entity seemingly putting its finger on the scale.

Music Ally’s reporting shows that the tracks were boosted by a tight cluster of anonymous Spotify curators: 25 of the top 30 playlist placements came from accounts with massive reach but minimal footprint – essentially a ghost operation designed to trigger the algorithm and let momentum carry the rest. The band did not rise to half a million listens in a month by virtue of how “good” they were, but because someone, somewhere, knew how to insert them into the ecosystem in such a way that the algorithm would take care of the rest. The numbers climbed, and the moment passed without fanfare. The music reached half a million monthly listeners before anyone stopped to look closely, and when they did, the tracks were already embedded, curated into both individual and machine-derived playlists, streaming in the background and folded into daily soundtracks.

And therein lies the issue: AI music, even if it was intended as an art piece or a statement, is now “good enough” that it simply disappears into the background. People who listen to music casually or passively will hear it and not realise, which is what opens the door for not just unscrupulous creators, but unscrupulous platform owners, to drive the wedge deeper without objection.

What truly makes the Velvet Sundown story interesting has nothing to do with the quality of the output, or even the truth behind it. It’s the fact that AI content is now being consumed both consciously and unconsciously. People are pressing play, and enough of them are doing it to give the illusion very real momentum. As the number of listens on Spotify grows, buoyed by an eager algorithm that cares little about taste but wholly about interest, what we get is a conclusion that looks a lot like this: AI music doesn’t need to be convincing; it just needs to be listenable enough, because that’s exactly what the system is built to reward. And that means this synthetic success story is now entirely replicable across almost all creative industries.

Fashion is primed for a similar experience – in fact, you could argue it’s already happening.

Amazon has been in the “sit back, watch, and replicate” game for a while: reading reviews, tracking search data, and building its own private-label products based on those signals. That process still relies in part on human teams, but the march of automation suggests that won’t remain the case forever, and generative tools are closing the gap. A prompt becomes an image; that image becomes a spec; and the spec in turn becomes a product. It’s then dropped into a storefront with a (naturally) generated description, and the sales flow in.
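
To make the shape of that loop concrete, here’s a minimal sketch of a prompt-to-product pipeline in Python. Every function and data structure below is hypothetical – stand-ins for a text-to-image model, a vision model that extracts a spec, and a copy-writing model – not a description of any real Amazon system:

```python
from dataclasses import dataclass

# Hypothetical sketch of a prompt-to-product pipeline. None of
# these stages map to a real platform's systems; they illustrate
# how little human input the loop requires.

@dataclass
class Spec:
    """A machine-readable garment spec derived from a generated image."""
    garment_type: str
    colours: list[str]
    materials: list[str]

@dataclass
class Listing:
    title: str
    description: str
    spec: Spec

def prompt_to_image(prompt: str) -> bytes:
    # Stand-in for a text-to-image model call.
    return f"<render of: {prompt}>".encode()

def image_to_spec(image: bytes) -> Spec:
    # Stand-in for a vision model that turns a render into a tech pack.
    return Spec(garment_type="hoodie", colours=["sage"], materials=["cotton"])

def spec_to_listing(spec: Spec) -> Listing:
    # Stand-in for an LLM writing the (naturally) generated copy.
    return Listing(
        title=f"{spec.colours[0].title()} {spec.garment_type}",
        description=f"A {spec.colours[0]} {spec.garment_type} in {spec.materials[0]}.",
        spec=spec,
    )

if __name__ == "__main__":
    # Demand signal in, storefront listing out – no designer in the loop.
    listing = spec_to_listing(image_to_spec(prompt_to_image("sage green oversized hoodie")))
    print(listing.title)
```

The point of the sketch is how short the chain is: three model calls separate a demand signal from a live listing, with no designer anywhere in the loop.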

Again, just as with Spotify, the machinery is already in place: demand signals, fulfilment systems, algorithmic front ends. Production is still the missing link, but if the demand is there for microfactory fulfilment, then the model will scale.

In both music and fashion (all creative industries for that matter), the immediate threat isn’t about replacing artists or designers. That’s not something we expect to see anytime soon. It’s about filling the space around them. Flooding the feed with content that doesn’t need to be good, just fast, just good enough, but more importantly, just there. There is a reason, after all, that default positioning is so sought-after in search, and that pre-installed applications and pre-existing integrations carry such clout when it comes to consumer and enterprise software. 

Publishing is already starting to experience the next turn here. TechCrunch recently reported that referrals to news sites from ChatGPT jumped from under a million to 25 million in a year. That would be an AI success story if the upshot were a net positive trend in traffic, but in reality organic search fell by more – at least in part because of Google’s blithe insistence on replacing search results with AI Overviews. The Wall Street Journal found that Business Insider lost over half of its organic search traffic between 2022 and 2025, and The New York Times and Washington Post have also seen substantial declines in traffic from search.

At The Interline, we’ve seen our own referral traffic from ChatGPT and other AI search products rise over the last twelve months to the point where it’s about to account for 5% of overall volume (though our organic search is also holding steady, which demonstrates that the rule is not universal). What we cannot attribute, right now, are the occasions where our content is surfaced in AI responses that don’t lead to clicks – where the end user gets the information they need from ChatGPT, with a citation, and moves on.

Given the wide-reaching net of AI across all creative industries, the question we have to ask is: can this kind of scraping, extraction, and averaging be slowed down? Can (and should) platforms be made to feel the cost of what they absorb?

Cloudflare has started testing an answer. At the start of this month, it launched a live pilot: a marketplace where websites can charge AI bots for scraping their content. If a model wants to ingest your work, it has to pay. The rates for those payments may be minuscule, but the objective is larger: to establish a market for AI scraping, so that publishers, artists, musicians, and other creatives can demonstrate the revenue being lost.
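
Mechanically, the pilot reportedly leans on the long-dormant HTTP 402 “Payment Required” status code. As a rough illustration – not Cloudflare’s actual implementation, and with the bot list, token header, and price header all invented for the sketch – a pay-per-crawl origin might behave something like this:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative only: a toy origin server that answers known AI
# crawlers with HTTP 402 unless they present a payment token.
# The user-agent list, "X-Crawl-Token" and "X-Crawl-Price" headers
# are assumptions for this sketch, not Cloudflare's API.

AI_CRAWLER_AGENTS = ("GPTBot", "ClaudeBot", "CCBot")
PRICE_PER_REQUEST = "0.001"  # notional USD per fetch

class PayPerCrawlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        is_ai_bot = any(bot in agent for bot in AI_CRAWLER_AGENTS)
        paid = self.headers.get("X-Crawl-Token") is not None  # hypothetical

        if is_ai_bot and not paid:
            # 402 turns a free scrape into a priced transaction.
            self.send_response(402)
            self.send_header("X-Crawl-Price", PRICE_PER_REQUEST)
            self.end_headers()
            self.wfile.write(b"Payment required to crawl this content.")
            return

        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Original editorial content.</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8402), PayPerCrawlHandler).serve_forever()
```

A crawler that wants the content retries with payment credentials; one that doesn’t gets nothing, and the refusal itself becomes a record of demand.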

It’s a small step but a structured one, and it points to a bigger shift: monetisation through machine legibility rather than through the content itself. The Interline falls on the side of the analysis that says you can’t put the genie back in the bottle – the models are already trained and the platforms are already moving. But Cloudflare’s approach hints at a way to make the feedback loop benefit more entities, and to force a transactional layer back into the system. The bigger question is whether other industries, like fashion and music, will follow.

So what might a fashion equivalent look like? One version might combine scraped design input with micropayments to the original creators (marketplace sellers, solo designers, brand archives). Pair that with generative interfaces and microfactory fulfilment, and the machinery starts to resemble a platform-native production stack.
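
What would the money flow look like? A toy attribution ledger – with the royalty rate, influence weights, and the attribution model itself all assumed for illustration – might split a per-sale pool pro rata across the creators whose scraped work informed a generated design:

```python
from collections import defaultdict

# Toy attribution ledger for the hypothetical model above: each
# generated product records which scraped sources informed it (and
# how heavily), and a per-sale royalty pool is split pro rata.
# The rate and weights are invented for illustration.

ROYALTY_RATE = 0.03  # 3% of sale price into the creator pool

def split_royalties(sale_price: float, influences: dict[str, float]) -> dict[str, float]:
    """Split the royalty pool across creators by influence weight."""
    pool = sale_price * ROYALTY_RATE
    total_weight = sum(influences.values())
    return {creator: pool * w / total_weight for creator, w in influences.items()}

ledger: dict[str, float] = defaultdict(float)

# One generated hoodie sells for $40; its design drew (per some
# assumed attribution model) on three scraped sources.
for creator, amount in split_royalties(
    40.0,
    {"solo_designer_a": 0.5, "marketplace_seller_b": 0.3, "brand_archive_c": 0.2},
).items():
    ledger[creator] += amount

print(dict(ledger))  # {'solo_designer_a': 0.6, 'marketplace_seller_b': 0.36, 'brand_archive_c': 0.24}
```

The arithmetic is trivial; the unsolved part is the attribution model that produces those weights in the first place, which is exactly where the incentives question resurfaces.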

Microfactories allow for short-run manufacturing, rapid iteration, and local fulfilment. When paired with generative design, you get a system where platforms can test, create, and ship products without ever involving a brand. Once that happens, origin might no longer matter, and the only real metric becomes performance: how often it plays, how quickly it sells, and how easily it fills space.

Do the incentives stack up? Probably not. In publishing, scraping royalties are unlikely to replace the value lost when direct visits or licensing disappear. The same dynamic could emerge in fashion: being part of a model’s training data might feel like earning reach and recognition, but without depth of return, it’s hard to call it value – and certainly not sufficient value to justify time spent by itself.

None of this means human creators are obsolete, but it does mean their work is part of an increasingly machine-dominated system. Not one where machines are responsible solely for determining what percolates to the top of the content or product pile, but one where they also become active participants in creating those works to begin with.

Which brings us back to The Velvet Sundown. It wasn’t real. It wasn’t fake. It was functional. It filled the gap, and the system welcomed it. And while it’s easy to deride the machinery of popular music as mulching everything down to the lowest common denominator and serving it back up… how much of mass market fashion can we really say is different?