LeBron James targets AI company over bizarre deepfake pregnancy videos

Jul 26, 2025 - 18:01
LeBron James and Zhuri James attend the 2025 NBA Summer League game between the LA Clippers and the Los Angeles Lakers

NBA superstar LeBron James has become one of the first major celebrities to push back against the unauthorized use of his likeness in AI-generated content. James' legal team recently issued a cease-and-desist letter to FlickUp, the company behind the AI image-generation tool Interlink AI.

According to a report from 404 Media, FlickUp disclosed the legal action to members of its Discord community in late June. The Interlink AI tool, hosted on the server, allowed users to create AI-generated videos of high-profile NBA players, including James, Stephen Curry, Nikola Jokić, and others. While many of the videos were harmless, some crossed into disturbing territory, such as a widely circulated clip of the Lakers star cradling a pregnant belly.

One of the most widely viewed videos created with Interlink AI depicted an AI-generated Sean "Diddy" Combs sexually assaulting Curry in a prison setting, with James standing passively in the background. That video alone reportedly amassed over 6.2 million views on Instagram.

404 Media confirmed with FlickUp founder Jason Stacks that James' legal team was behind the cease-and-desist letter. Within 30 minutes of receiving it, Stacks said he decided to "remove all realistic people from Interlink AI’s software." Stacks also posted a video addressing the situation, captioned simply: "I’m so f**ked."

LeBron James is among a growing list of celebrities whose likenesses have been used without consent in disturbing AI-generated content. Pop star Taylor Swift has been repeatedly targeted with deepfake pornography, while Scarlett Johansson and Steve Harvey have both publicly condemned the misuse of their images and voiced support for legislation to curb it. However, James stands out as one of the first to take formal legal action against a company enabling this type of content through its AI tools.

Several bills are currently making their way through Congress to address the rise of nonconsensual AI-generated content. The recently passed Take It Down Act criminalizes the publication of intimate imagery without consent, or the threat to publish it, including deepfakes and AI-generated pornography. Two additional proposals — the NO FAKES Act of 2025 and the Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025 — have also been introduced.

The NO FAKES Act focuses on preventing unauthorized AI replication of a person's voice and likeness, while the latter seeks to safeguard original works and enforce transparency around AI-generated media.