There is No Such Thing as a Safe Haven From AI Slop, Not Even Spotify

The first clue that something was off came when a new album from the band HEALTH appeared in my Spotify Release Radar. But the cover seemed strange: it didn’t look anything like a HEALTH album. Clicking on it confirmed my suspicion: this wasn’t HEALTH’s music. Instead, it was one of three fake albums that appeared under the band’s name that weekend. HEALTH’s social media made light of the situation, and the albums were eventually removed from Spotify. I went back to my routine until the next weekend, when a new album by the artist Annie popped up. Since Annie had recently released a single titled “The Sky Is Blue,” I assumed this was legitimate. But when I played the album, all I heard were birds chirping over generic New Age instrumentals, not at all what I expected from Annie.

It turned out this wasn’t an isolated incident. In my group chat, friends began sharing similar stories of bogus releases appearing under the names of various artists. Some were metalcore bands, like Caliban and Silent Planet; others were bands or solo acts with single-word names, including Swans, Asia, Standards, and Gong. In each case, a mysterious album would appear on the artist’s Spotify page, often without a single song that sounded like their usual style. Some of the fake albums vanished after a few days; others stuck around for weeks or longer, much to the frustration of the artists affected.

Standards’ guitarist Marcos Mena described how strange it was to see a fake album under their verified page. Thinking Spotify would handle it quickly, Mena contacted them, but instead of an immediate resolution, he received a reply weeks later stating that the album had been “mapped correctly” to the artist’s page. By then, the fake Standards album had lingered on Spotify for nearly two months, adding confusion to the band’s profile and distracting from the group’s actual new release. For Mena and others, the situation raised an obvious question: why was this happening?


Money seems to be the answer. Spotify doesn’t operate like social media platforms where artists can upload content directly to their pages. Instead, artists work with music distributors who handle the logistics, including licensing, metadata, and royalty payments. These distributors send songs and their metadata (such as song title, artist name, songwriter, and label information) in bulk to Spotify and other streaming platforms. Since this setup relies heavily on the accuracy of the metadata and the integrity of the distributor, Spotify essentially trusts that all information it receives is accurate. Unfortunately, this trust can be exploited.
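The pipeline described above can be sketched in code. This is a hypothetical illustration of the trust problem, not Spotify’s actual ingestion system; the field names and the `catalog` structure are assumptions based only on the metadata fields the article lists.

```python
from dataclasses import dataclass

@dataclass
class DeliveryRecord:
    """One track in a bulk delivery from a distributor.

    Fields mirror the metadata the article mentions: song title,
    artist name, songwriter, and label information.
    """
    song_title: str
    artist_name: str
    songwriter: str
    label: str
    distributor: str

def ingest(records: list[DeliveryRecord],
           catalog: dict[str, list[DeliveryRecord]]) -> None:
    """Naive trust-based ingestion: map each track to an artist page
    purely by the artist_name string the distributor supplied."""
    for rec in records:
        # No check that this distributor actually represents the artist --
        # this is exactly the trust the article says can be exploited.
        catalog.setdefault(rec.artist_name, []).append(rec)

# A fraudulent upload lands on the real artist's page just by reusing the name:
catalog: dict[str, list[DeliveryRecord]] = {}
fake = DeliveryRecord("Fake Song", "Standards", "Unknown",
                      "Gupta Music", "Gupta Music")
ingest([fake], catalog)
print("Standards" in catalog)  # True: the fake track is now on their page
```

The point of the sketch is that name-based mapping alone gives a hostile distributor a direct path onto a verified artist’s page.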

In the case of Standards, the fake album was released under a different distributor than the band’s actual label, which means any revenue from streams of the fake album would flow not to Standards but to the unknown distributor—identified as Gupta Music. This company, as it turns out, appears to have uploaded hundreds of similar albums, each featuring suspiciously AI-generated cover art and nonsensical band names like “Rany,” “Living,” or “Bedroom.”

Spotify eventually removed the fake content, citing repeated violations of its Metadata Style Guide and severing ties with the distributor responsible. A spokesperson for Spotify noted that the company invests substantially in both automated and manual review processes to prevent royalty fraud. However, the frequency of such incidents suggests that more comprehensive solutions may be necessary.

These scams are not new to Spotify, or even unique to music. Streaming scams involving bots and AI-generated music date back years. In one high-profile case, a Danish man was convicted for using bots to stream songs and collect royalties fraudulently, amassing $300,000 before being caught. In another instance, Michael Smith allegedly generated $10 million in royalties by using bots to play AI-generated songs he uploaded to streaming platforms under fake artist names. These examples underscore how streaming platforms’ payout structures can be vulnerable to exploitation, especially when combined with the ease of creating AI-generated music.

Universal Music Group (UMG) recently sued Believe, a music distributor, and its subsidiary TuneCore, alleging that the companies knowingly distributed music uploaded by fraudsters. The lawsuit highlights one common tactic: slightly altering well-known artists’ names to capture streams from fans who misspell names when searching for music. Fake artists like “Kendrik Laamar” and “Llady Gaga” populate playlists and capture streams that would otherwise go to real artists, who ultimately lose out on earnings.

For Spotify, the process of validating content is challenging, but not impossible. In cases like Standards, for example, Spotify could flag unusual metadata changes, such as when an album appears under a different label than usual. While Spotify has some internal tools to identify music that doesn’t fit an artist’s style, many of these tools rely on pattern recognition and manual oversight. Streaming platforms are increasingly burdened by this surge in AI-generated content, which threatens to drown out legitimate artists’ work, much like the impact of AI-generated content on platforms like Facebook or Twitter.
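A minimal version of the check suggested above, flagging a release whose label has never appeared in the artist’s catalog, might look like the following. This is a sketch under assumed data shapes, not a description of Spotify’s internal tooling, and “Real Label” is a placeholder name, since the article does not identify Standards’ actual label.

```python
def flag_unusual_release(new_label: str, prior_labels: list[str]) -> bool:
    """Return True when a new album's label has never appeared in the
    artist's release history (case-insensitive), marking it for review."""
    seen = {label.lower() for label in prior_labels}
    return new_label.lower() not in seen

# An upload from an unfamiliar distributor gets flagged for manual review;
# a release on the artist's usual label passes through.
print(flag_unusual_release("Gupta Music", ["Real Label"]))   # True
print(flag_unusual_release("Real Label", ["Real Label"]))    # False
```

A check this crude would misfire on legitimate label changes, which is why a flag like this would feed a manual review queue rather than trigger automatic removal.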

As technology advances, AI continues to accelerate and streamline these scams, but Spotify and its competitors may need to invest in more robust solutions to protect artists and listeners alike. Banning all AI-generated music might seem tempting, but even this would be difficult to implement given that many artists are beginning to experiment with AI in their music creation process. Instead, industry observers suggest that a balance between automated and manual content moderation may be key, particularly if lawsuits like UMG’s continue to emphasize the financial and reputational costs of not addressing fraudulent uploads.

Whether these preventive measures become standard remains to be seen. Meanwhile, as incidents like these grow more common, artists and music lovers will have to navigate an increasingly murky landscape—one where the thrill of discovering new music could easily lead to a dead end filled with AI-generated clutter instead.
