In a disturbing twist of technological overreach, AI-generated music and other tracks are being released under the names of deceased artists, without permission from their estates, and distributed on major platforms like Spotify and TikTok. These fakes appear alongside legitimate catalogs, misleading fans and tarnishing artistic legacies. The deception isn't hypothetical. It's already happening.
Reports have surfaced of AI-generated songs published under the names of artists like Blaze Foley and Guy Clark, apparently uploaded through TikTok's SoundOn platform and pushed to Spotify's verified artist pages. The result is digital impersonation: profitable for the uploader, invisible to most fans, and deeply disrespectful to the memory of the artist.
And worse: the estates of these artists were never consulted. They weren't even notified.
Spotify and TikTok Are Not Innocent Bystanders
This isn't just a glitch in the system. It's a product of deliberate platform design.
Spotify evidently relies solely on "trusted" distributors like SoundOn to provide metadata, but fails to verify whether the artist is even alive or whether the rights are legitimate. When someone uploads a fake track, Spotify's system doesn't flag it. It routes it straight to the artist's verified profile, right next to their real work. That's not a neutral mistake. That's platform-enabled impersonation.
TikTok, through its SoundOn pipeline, makes it easy to upload and monetize content without verifying ownership. That's how these AI fakes are getting in. Why is the security so bad? You don't think it's about the money, do you?
And let's be honest: this whole setup is presumably profitable for Spotify and TikTok. If it weren't, why would they keep operating this way? Every fake stream still generates Spotify's well-known 30% vig. Every fraudulent upload still drives engagement. Each helps fuel the algorithm, creating more activity, more recommendations, and more clicks. The platforms benefit from the fraud, even when the real artists don't.
To make matters worse, the estates often have no way of knowing this is happening, because if Spotify notified them, Spotify would be denied the revenue opportunity. Unless a family member or estate representative happens to maintain a Spotify for Artists account, and happens to check it, they may never realize that fraudulent new tracks are appearing under the artist's official name. There's no alert. No approval request. No safeguard.
This isn't a system that can be trusted.
Apple Proves This Is a Choice, Not a Limitation
Unlike Spotify or TikTok, Apple Music appears to have avoided these impersonation scandals, likely because it works only with a curated group of distributors who are required to verify uploader identity and rights. That basic diligence makes a world of difference. It proves this isn't a technical problem; it's a business decision.
Apple protects artists. Spotify and TikTok profit off them, even the dead ones.
Not Just About Royalties, About Fraud
The estates harmed by these impersonations aren't asking for a royalty check. They're asking for the fraud to stop.
And they're right to. If Spotify and TikTok allow their platforms to be used for impersonating dead artists, with full knowledge of the pattern, it raises serious questions about their responsibility and intent. This isn't merely a case of failing to moderate content; it may rise to the level of willful blindness, a concept recognized in law when companies deliberately avoid learning the truth.
While Spotify may take down the tracks if the estate notifies them, or more likely if the estate has the actual name of someone at Spotify to call (sorry, email), there's no DMCA for trademark or right of publicity.
AI Is Making It Worse
As generative AI platforms continue scraping creative works, without consent or compensation, there's a growing risk that these fake tracks will flood the internet, undermining both artists and consumer trust. What does it mean to list a track under an artist's name if the platform itself can't vouch for its authenticity?
If these abuses aren't stopped now, AI-generated fakes may become a permanent fixture of the music ecosystem, especially for artists who can no longer speak for themselves.
A Call to the FTC: Investigate and Enforce
This is where the Federal Trade Commission (FTC) must step in. The agency has a mandate to protect the public from unfair and deceptive practices, and this qualifies on both counts.
The FTC should investigate:
– Whether Spotify or TikTok knowingly facilitated impersonations;
– Who profited from the AI-generated fakes;
– Why safeguards like Apple's haven't been adopted;
– Whether these practices violate consumer protection law or constitute deceptive business conduct.
The FTC has the power to require changes to distribution and verification processes, and to set a precedent that digital impersonation of artists, living or dead, will not be tolerated.
Cultural Legacy Is on the Line
This issue goes beyond music. It speaks to the future of digital identity, consent, and trust in the age of generative AI. Deceased artists can't defend themselves. Their estates are often the last line of defense. But they shouldn't have to fight platform by platform, fraud by fraud, in a whole new game of whack-a-mole simply because the dominant platform fails to police its systems.
And they certainly shouldn't be expected to patrol Spotify themselves, hoping to catch unauthorized uploads before fans or the press discover them.
It's time for the FTC to act, and to make clear that the legacies of America's artists are not up for exploitation.