Spotify Under Fire for Hosting AI-Created Songs by Dead Artists Without Permission
In a shocking twist that has rocked the music industry, Spotify is facing severe backlash after AI-generated songs appeared on its platform under the names of deceased artists, without approval from their estates or record labels. The incident highlights the growing ethical and legal challenges that artificial intelligence poses for the entertainment industry.
AI Composed Tracks Emerge on Official Profiles of Dead Artists
An investigation by 404 Media uncovered a startling practice: AI-composed songs were being uploaded to the official Spotify profiles of real artists, creating the illusion that these were posthumous releases. One of the most controversial instances is a track titled “Together” that appeared on the official page of Blaze Foley, a country artist who was murdered in 1989.
Not only does the song bear no resemblance to Foley’s style or voice, but it is also accompanied by a photo of a young, blonde man who is clearly not Foley. Tracing the upload’s origins, journalists linked the track to an entity called “Syntax Error,” which has been tied to other suspicious or outright fake music releases.
Multiple Artists Affected: The Rise of Unauthorized AI Music
This isn’t an isolated case. According to The Next Web, another fraudulent song titled “Happened To You” was published under the name of Grammy-winning artist Guy Clark, who passed away in 2016. Like Foley’s case, the track was traced back to Syntax Error, raising serious questions about Spotify’s oversight and vetting process for uploaded content.
After the revelations, Spotify swiftly removed the tracks, but critics argue the damage had already been done. These weren’t innocent mistakes — they were systematic abuses of AI technology and artist identities. The very fabric of trust between streaming platforms and listeners is now being questioned.
AI Bands Are Going Viral — Without Disclosure
Meanwhile, AI-generated music continues to flood Spotify, often with no indication that it wasn’t created by human artists. One notable example is “Velvet Sundown,” an AI band whose track “Dust on the Wind” went viral, amassing over 2 million streams since its June 20 release.
The song bears an uncanny similarity to Kansas’s 1977 hit “Dust in the Wind,” yet nowhere in Spotify’s interface is it labeled as AI-generated. While Velvet Sundown now describes itself as a “synthetic music project,” most listeners remain unaware of its non-human origin.
This lack of transparency is deeply concerning, particularly as Spotify’s CEO Daniel Ek has expressed public support for allowing AI-produced content — except in cases of direct plagiarism. However, critics say Spotify’s current detection and moderation systems are failing to uphold even those standards.
The Music Industry’s Fierce Response
Unsurprisingly, industry leaders are pushing back. According to The Guardian, Sophie Jones, Chief Strategy Officer of the British Phonographic Industry (BPI), has vocally criticized tech platforms that train AI models on copyrighted music without permission or compensation.
She warns that such practices undermine the value of human creativity and threaten the financial livelihoods of real artists. Tech companies, Jones argues, are unfairly empowering synthetic acts to compete head-to-head with genuine musicians — often using the very work of those musicians to do it.
This sentiment is echoed across the industry, with many demanding that AI-created content be clearly labeled on all streaming platforms. Some services are already taking action. For example, Deezer has developed an algorithm that can detect songs created using AI software like Suno and Udio, allowing the platform to flag or remove them as needed.
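Deezer has not published how its detector works, so any implementation details here are assumptions. Purely as an illustrative sketch, such a system might score tracks on audio features known to correlate with generator artifacts and flag high scorers for human review. The feature names, weights, and threshold below are all hypothetical, chosen only to show the shape of a feature-based classifier:

```python
# Illustrative sketch only: Deezer's actual AI-music detector is proprietary.
# A toy logistic model over hypothetical audio features, flagging tracks
# whose scores suggest AI-generation artifacts for human review.
import math

# Hypothetical feature order: [spectral_flatness, phase_coherence, vocal_breathiness]
AI_WEIGHTS = [3.2, -2.5, -1.8]  # made-up weights, for demonstration only
BIAS = 0.4

def ai_probability(features):
    """Return a 0-1 score that a track is AI-generated (toy model)."""
    z = BIAS + sum(w * f for w, f in zip(AI_WEIGHTS, features))
    return 1 / (1 + math.exp(-z))  # logistic function squashes z into (0, 1)

def flag_track(features, threshold=0.8):
    """Flag a track for human review when its score crosses the threshold."""
    return ai_probability(features) >= threshold
```

A production system would extract real features from audio (and likely use a trained neural model rather than hand-set weights), but the flag-then-review pipeline is the part platforms like Deezer describe publicly.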
Why This Matters: More Than Just Copyright
At its core, this controversy isn’t just about copyright infringement — it’s about credibility, deception, and digital ethics. When AI-generated songs are published under the names of real (and often deceased) artists, it misleads listeners, erodes trust, and disrespects legacies.
It’s also a matter of fair compensation. AI doesn’t need to earn a living — human artists do. Every fake stream directed toward a synthetic song is a lost opportunity for a real creator trying to support themselves through their craft.
What Needs to Happen Next
If streaming platforms like Spotify want to maintain credibility and trust, they must take immediate steps to:
- Label all AI-generated content clearly and visibly
- Ensure songs are only uploaded with proper permission from artists or estates
- Implement robust detection systems to stop abuses before they go live
- Create transparent content policies around AI music and synthetic identities
The music industry is at a crossroads. AI can be a powerful creative tool, but without ethical guidelines and strict enforcement, it risks becoming a force for exploitation and misinformation.