Spotify removes AI-generated song falsely attributed to late artist Blaze Foley

GEEK STAFF
July 23, 2025

Spotify is once again facing scrutiny over the rise of AI-generated content after a song falsely attributed to late country artist Blaze Foley appeared on the platform. The track, titled “Together,” was uploaded as if it were an official posthumous release, complete with cover art, metadata, and copyright credits. Foley, however, died in 1989—36 years before this song appeared.

The song was swiftly removed after fans, along with Foley’s label Lost Art Records, flagged the release. The track was reportedly uploaded via SoundOn, a music distribution platform owned by TikTok. Another AI-generated song misattributed to the late Guy Clark, who passed away in 2016, was also removed in the same timeframe.

What sets these cases apart from the growing flood of AI-generated lo-fi or ambient music is the attribution. Unlike AI music published under fictitious artist names or clearly labeled experimental projects, these songs appeared in the verified discographies of real, deceased musicians. There was no indication that AI tools were involved, nor was there any mention of the uploads being tributes or simulations. This created the false impression that these were authentic, authorized releases.

Spotify confirmed the removal and emphasized that such content violates its deceptive content policies, which prohibit impersonation and misleading representations. The company added that it takes enforcement action against distributors that repeatedly fail to screen for such misuse—including permanent removal.

But the appearance of the track on a platform as large as Spotify raises concerns about the adequacy of existing verification and moderation systems. With tens of thousands of songs being uploaded daily, automated checks are a necessity. However, that same automation can become a loophole for content that meets technical requirements but crosses ethical or legal boundaries.

The issue also raises broader questions about authorship, digital identity, and consent in the era of generative AI. AI tools can now mimic voices, lyrics, and artistic styles with increasing realism. Without effective safeguards, it’s entirely possible to fabricate entire discographies for artists who are no longer alive—and whose estates may lack the means to police every new upload.

This isn’t a problem unique to Spotify. Apple Music and YouTube have also faced criticism for hosting deepfake content, while AI music generators like Suno and Udio continue to lower the barrier for creating passable musical imitations in seconds. The underlying technology isn’t the problem—it’s the misuse of that technology combined with platforms’ current lack of oversight.

Possible solutions include better identity verification for artist accounts, embedding metadata or watermarking in AI-generated audio, and more transparent labeling of synthetic content. However, these methods introduce friction to a process many platforms want to keep seamless for creators.
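As a rough illustration of the metadata-labeling idea, the sketch below uses the open-source mutagen library to write a disclosure tag into an MP3 file's ID3 metadata at upload time. The "AI_GENERATED" and "AI_TOOL" frame names, the tool name, and the file path are assumptions chosen for demonstration; they are not a Spotify requirement or any industry standard.

# A minimal sketch of synthetic-content labeling via custom ID3 text frames.
# The "AI_GENERATED" / "AI_TOOL" tag names are hypothetical, not a platform standard.
from mutagen.id3 import ID3, ID3NoHeaderError, TXXX

def label_as_ai_generated(mp3_path: str, tool_name: str) -> None:
    """Embed user-defined text frames flagging the track as AI-generated."""
    try:
        tags = ID3(mp3_path)   # load existing ID3 tags from the file
    except ID3NoHeaderError:
        tags = ID3()           # file has no ID3 header yet; start fresh
    # TXXX is the ID3v2 "user-defined text information" frame.
    tags.add(TXXX(encoding=3, desc="AI_GENERATED", text="true"))
    tags.add(TXXX(encoding=3, desc="AI_TOOL", text=tool_name))
    tags.save(mp3_path)

if __name__ == "__main__":
    # Hypothetical example file and tool name, for illustration only.
    label_as_ai_generated("together.mp3", "example-voice-model")

A distributor could require such a frame before accepting an upload, and a streaming service could surface it as a "synthetic content" badge, though, as noted above, any mandatory check adds friction to the upload pipeline.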

The Blaze Foley incident underscores the ethical tightrope streaming platforms must now walk. When AI is used responsibly, it can serve as a powerful creative tool. But when it’s used to impersonate artists—especially those no longer able to object—it becomes a matter of digital fraud, with consequences for credibility, compensation, and cultural integrity.
