ChatGPT Images 2.0 Is Becoming a Market Fraud Tool with Deepfakes

Deepfakes have shifted from a niche concern to a mass-market threat. May’s incidents show how consumer-grade tools now outpace any institutional response.

The damage extends into crypto, where scammers are leveraging artificial intelligence (AI) to run impersonation scams.

The Deepfake Economy Is Here, and Detection Is Losing

In early May 2026, AI-generated content showed up across politics, entertainment, and crime, as documented by Resemble AI. 

FBI Director Kash Patel posted a video that appeared to use AI to generate shots nearly identical to those in the Beastie Boys’ “Sabotage” music video. Furthermore, an AI video of mayoral candidate Spencer Pratt drew 4.1 million views on X.


These tools aren’t just being used for viral content; they are also fueling real financial harm. A Chicago man lost $69,000 to a scammer who flashed an AI-generated US Marshals badge on a video call.

Meanwhile, the Atlantic’s Lila Shroff found that OpenAI’s ChatGPT Images 2.0 can generate fake IDs, prescriptions, receipts, bank alerts, and news screenshots.

“All of this makes it even harder for banks, hospitals, government agencies, and the like to prevent fraud,” Shroff wrote.

404 Media exposed Haotian AI, a Chinese real-time deepfake tool. Reporter Joseph Cox used it to swap faces on a live Microsoft Teams call, demonstrating that the technology is functional, for sale, and already being deployed against real victims.

“Three of this week’s stories, Haotian AI, the Meloni deepfake, and the Patel FBI video, come from completely different categories and geographies, but they share a structural condition: the tools used to produce the harm are consumer-grade, widely available, and improving faster than any institutional response. Haotian AI costs a few hundred dollars and works on Teams. ChatGPT Images 2.0 is a subscription product,” Resemble AI said.

Crypto Also Bears the Cost

Crypto has become a prime target for AI-driven deception. According to Chainalysis, fraudsters are now pairing deepfakes, face-swap apps, and large language models with classic romance and investment cons, and the math favors them.

The average AI-assisted crypto scam nets roughly $3.2 million, about 4.5 times the haul of a conventional scheme. Several cases underline the threat. In August 2025, attackers stole $2 million by impersonating the founder of Plasma. 

BeInCrypto has also reported on North Korean operatives running deepfake video calls on Zoom. Together, these incidents mark AI-powered impersonation as one of the sector’s most pressing security risks.


The post ChatGPT Images 2.0 Is Becoming a Market Fraud Tool with Deepfakes appeared first on BeInCrypto.
