
Investigation

AI Is Impersonating Musicians on Spotify. It Is Stealing Their Identity and Their Income.

Published April 11, 2026  ·  8 min read

In January, a British indie folk singer-songwriter named Ormella released a live EP recorded in her living room. She had been deliberate about it — she wanted to make something that was visibly, unmistakably human. No AI. No generated elements. Just her, performing.

A few days after the release, messages started arriving from fans. Was the new song hers? It did not sound like her. She checked her Spotify profile and found a track she had never recorded, under her own name, on her own artist page, being recommended to her own listeners.

Someone had used AI to generate a song and uploaded it directly to her profile. She had never recorded it. She had never heard it.

This is not an isolated incident. It is a pattern, and the scale of it is becoming impossible to ignore.

Sony Music has requested the removal of more than 135,000 AI-generated songs impersonating its artists from streaming platforms in recent months. Deezer reported that 50,000 AI-generated tracks are being uploaded to its platform every single day, accounting for 34% of all new music. Spotify removed over 75 million spammy tracks in the twelve months to September 2025 — a period marked, as the platform noted, by the explosion of generative AI tools. 75 million. In one year.

The fraudsters behind these uploads are not, in most cases, trying to deceive dedicated fans. They are running an arbitrage operation. Upload enough AI-generated tracks under enough real artists’ names, and some fraction of them will accumulate streams before anyone notices. At fractions of a cent per stream, even a thousand plays is worth a few dollars. Multiply that across thousands of artists and tens of thousands of tracks, and the economics become meaningful — all of it extracted from the streaming ecosystem that real musicians depend on.
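The arbitrage arithmetic is simple enough to sketch. A rough model, with an assumed per-stream payout (commonly reported rates sit somewhere around $0.003–$0.005; the figure below is an illustrative mid-range guess, not a number from Spotify):

```python
# Back-of-the-envelope model of the streaming arbitrage described above.
# The payout rate is an assumption, not a published Spotify figure.
PAYOUT_PER_STREAM = 0.004  # dollars; assumed mid-range per-stream rate


def fraud_revenue(tracks: int, avg_streams_per_track: int,
                  payout: float = PAYOUT_PER_STREAM) -> float:
    """Dollars extracted before takedowns, under the assumed rate."""
    return tracks * avg_streams_per_track * payout


# A thousand plays on a single fake track is only a few dollars...
print(f"${fraud_revenue(1, 1_000):,.2f}")
# ...but spread across tens of thousands of uploads, it becomes a business.
print(f"${fraud_revenue(50_000, 1_000):,.2f}")
```

The point of the model is the asymmetry: each individual upload is nearly worthless, so no single takedown hurts the fraudster, while the aggregate is worth defending at industrial scale.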

The targets are chosen with a specific logic.

Scammers have focused particularly on low- and mid-level artists with devoted followings — people with enough listeners that an upload will get clicks, but not enough profile that their accounts are closely monitored or quickly defended. They have also targeted dormant artists whose return to music might spark genuine interest. AI-generated songs have been uploaded to the profiles of SOPHIE, the electronic music producer who died in 2021, and Uncle Tupelo, the 1990s band whose members went on to form Wilco. A fake version of King Gizzard and the Lizard Wizard appeared under the name “King Lizard Wizard” — with copied song titles and lyrics — and was briefly recommended by Spotify’s own Release Radar playlist before being taken down.

British artist Benedict Cork described watching an AI imitation of his own work spread across streaming platforms within a week of posting a clip of himself playing a new track on social media. Someone had heard the snippet, fed it to a generator, and uploaded the result before his actual recording was even finished.

“At first I found it really funny,” he said. “Then I was impressed by how amazing the technology is. And then I became a little more angry.”

That arc — amusement, then wonder, then anger — is becoming familiar.

In late March, Spotify launched Artist Profile Protection, a new opt-in feature that allows artists to review and approve releases before they appear on their profile. When enabled, any music delivered to Spotify under an artist’s name triggers an email notification. The artist can approve it or decline it. If they do not respond, it is blocked by default.
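Mechanically, the feature as described amounts to a default-deny rule: a pending release is published only on explicit approval, and silence blocks it. A minimal sketch of that logic — all names here are hypothetical, since Spotify has not published an API for this:

```python
from enum import Enum, auto


class Decision(Enum):
    APPROVED = auto()
    DECLINED = auto()
    NO_RESPONSE = auto()  # the artist never answered the notification email


def resolve_release(protection_enabled: bool, decision: Decision) -> bool:
    """Return True if a delivered release goes live on the artist's profile.

    Models the behaviour described above: with protection off, anything
    delivered under the artist's name is published; with it on, only an
    explicit approval publishes, and no response blocks by default.
    """
    if not protection_enabled:
        return True  # today's default: delivery equals publication
    return decision is Decision.APPROVED
```

The design choice that matters is the default. An opt-in feature with a default-deny rule protects only the artists who have switched it on; everyone else remains in the delivery-equals-publication regime.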

It is a meaningful step. It is also, as a response to the scale of the problem, somewhat like installing a lock on one door of a house that has no walls.

Spotify is one platform. The Artist Profile Protection feature is opt-in, meaning artists who do not know about it — which includes most independent musicians without active management — are not protected. And declining a release on Spotify does not prevent it from going live on Apple Music, Tidal, Amazon Music, or any of the dozens of smaller streaming services where the same tracks can accumulate streams. TechCrunch noted that Spotify’s own announcement acknowledged this directly: artists who decline a release on Spotify should still contact their label or distributor because the music may appear elsewhere regardless.

What makes this story different from the other AI displacement stories on this site is the specific nature of the harm.

When a translator loses freelance work because clients have switched to AI tools, what is lost is income and livelihood — real and serious, but abstract at the level of individual identity. When a lawyer finds that junior associates are no longer being hired because AI does their document work, what is lost is a career pipeline — also real, but diffuse.

What is happening to musicians on Spotify is something more visceral. It is identity theft. Someone is putting words in their mouths — or rather, notes under their names — and distributing those notes to their listeners, through their own profiles, generating revenue that goes elsewhere. The harm is financial, but it is also reputational, emotional, and deeply personal in a way that most forms of AI disruption are not.

A musician’s catalogue is not just their income stream. It is their artistic record. The songs under their name, on a major streaming platform, are a form of public statement about who they are and what they have made. An AI-generated fake in that catalogue is not just theft. It is a kind of forgery.

The deeper problem, which the Spotify feature does not solve, is structural.

The music distribution ecosystem was built on openness. Third-party services like DistroKid and TuneCore — which handle the uploading of music to streaming platforms on behalf of independent artists — lowered the barrier to entry for independent musicians in a way that genuinely democratised access to global distribution. The same infrastructure that made it possible for an independent artist in Leeds to get her music onto Spotify with minimal cost or friction also made it possible for a fraudster with a laptop and an AI generator to upload fake songs to her profile.

Fixing that infrastructure requires changes that will slow down legitimate distribution too. Every additional verification step that prevents a fraudster from uploading AI slop is also an additional step that an independent artist has to navigate to release real music. The platforms are trying to thread that needle, and Spotify’s Artist Profile Protection is a genuine attempt to do so. But the needle is very fine, and the fraudsters only have to find one gap.

Musicians are not the first profession on this site to find themselves navigating AI displacement from an unexpected direction. Translators did not expect the market to collapse as fast as it did. The people building AI are being replaced by the AI they built. In each case, the disruption arrived through a mechanism that was not quite what anyone had predicted — not a dramatic replacement, but a quieter erosion of something that had seemed secure.

For musicians, the erosion is happening on two fronts simultaneously. The first is economic: AI-generated music competing for streams, playlist placements, and listener attention, produced at a cost that human musicians cannot match. The second is existential: AI impersonators uploading fake music under real names, corrupting the artistic record that musicians spend careers building.

Ormella said she was glad Spotify had taken the step of introducing Artist Profile Protection. She is right to be glad. It helps. It does not solve the problem on the other platforms. And it does not address the broader question of what the music industry looks like in five years, when the ratio of AI-generated to human-made music on streaming platforms has shifted even further in a direction it is already moving fast.

75 million tracks removed in one year. 50,000 AI uploads per day on Deezer alone. 135,000 takedown requests from one major label.

The fans who messaged Ormella asking whether the song was really hers were doing something important, even if they did not know it. They were paying attention. They were asking the question that streaming algorithms cannot ask and platforms cannot require: is this actually human?

That question is going to matter more, not less, as this continues.


Sources

Time — AI slop is threatening musicians (March 26, 2026)
TechCrunch — Spotify tests Artist Profile Protection (March 24, 2026)
Music Business Worldwide — Spotify lets artists vet releases (March 2026)
Spotify Newsroom — Spotify strengthens AI protections (September 2025)
Digital Music News — Artists fight back against AI slop (March 29, 2026)

Related

My Sister Was a Translator for 15 Years. AI Took Her Work in Months. →
Even the People Building AI Are Being Replaced By It →
AI and the Creative Industries →
Will AI Replace Designers? →