AI Performers and Deepfake Technology: Already Featured in YouTube Advertisements


For 40 years, Henry and Margaret Tanner have been crafting leather footwear in Boca Raton, Florida. “No shortcuts, no inferior materials, just genuine, high-quality workmanship,” declares Henry in a YouTube advertisement for Tanner Shoes.

Astonishingly, Henry has accomplished this feat despite having a deformed hand, and a photo on the company’s site shows Margaret with only three fingers, both telltale signs of AI-generated imagery.

I found Tanner Shoes via YouTube advertisements. As a writer focused on men’s fashion, I was captivated by these custom shoemakers. In a typical advertisement, an older gentleman, likely Henry, appears over footage of “handmade” shoes, stating, “They don’t manufacture them like they used to, but for 40 years we have…Customers have told us our shoes possess a timeless appeal and are worth every cent. However, now you won’t need to spend much at all because we’re retiring. For the first and final time, every single pair is 80 percent off.”

I have my doubts that the Tanner Shoes “retirement” sale is genuine, just as I doubt the images of Henry and Margaret Tanner are. Beyond this advertisement, I found no digital footprint for the couple or any proof that Tanner Shoes exists in Boca Raton. I contacted Tanner Shoes for verification but received no reply.

Reddit users have come across comparable YouTube ads for other fictitious family-run businesses, suggesting these misleading advertisements are not unique. One Redditor mentioned encountering similar ads in German featuring an AI grandmother closing her jewelry shop. After I asked YouTube about the Tanner Shoes ads, the platform suspended the advertiser’s account for violating its policies.

These advertisements are part of a broader trend of AI-generated content in YouTube promotions. AI video ads also appear on Instagram and TikTok, but I focused on YouTube, which is owned by Google.

While AI possesses valid applications in marketing, many AI video advertisements on YouTube are misleading, crafted to fool viewers into purchasing leather shoes or diet pills. Reliable statistics on AI fraud are scarce, but the FBI issued a warning in 2024 indicating a rise in AI-fueled cybercrime. According to a Bolster.ai report, online scams and phishing have surged by 94 percent since 2020.

AI tools can rapidly create realistic videos, images, and audio. Scammers can effortlessly generate AI “actors” for their promotional content.

In another AI video advertisement I reviewed, an AI actor impersonates a financial analyst. I received this advertisement repeatedly, as did numerous users on Reddit and LinkedIn.

In this video, the financial analyst asserts, “I’m likely the only financial advisor who discloses all his trades online,” and “I’ve succeeded in 18 of my last 20 trades.” To get in, viewers simply click a link to join a covert WhatsApp group. Other AI actors pledge weight loss secrets (“I shed 20 pounds using just three ingredients I already had in my fridge!”). Some are celebrity deepfakes.

I was taken aback to see former Today host Hoda Kotb endorsing dubious weight loss methods on YouTube. “Ladies, the new viral recipe for pink salt was highlighted on the Today show, but for anyone who missed it, I’m here to share this 30-second trick. As a single mom of two, I tried the pink salt method to lose weight quickly, but I had to stop because it was melting too swiftly.”

This phony Kotb guarantees that the weight loss trick is genuine. “This is the same recipe used by Japanese celebrities to get slim. When I first encountered this method, I was skeptical as well. Harvard and Johns Hopkins confirm it’s 12 times more effective than Mounj…If you don’t eliminate at least four chunks of fat, I will personally buy you a case of Mounjaro pens.”

Clicking the ad reveals more celebrity deepfakes and dubious customer “testimonials.” The video concludes with a promotion for Exi Shred diet pills. Kotb’s representatives did not respond to a request for comment, but I uncovered the original footage used to produce this deepfake. The authentic video was uploaded on April 28 on Instagram, and it featured in AI video ads by May 17.

Kotb is another victim of AI deepfakes sophisticated enough to slip past YouTube’s advertising review process.

AI creations can look authentic at first glance, but there’s usually a tell. The Kotb deepfake was built from an altered version of a real video, so the fake Kotb mirrors the original’s expressions and movements. Another giveaway? AI impersonators often mispronounce common words.

The AI financial analyst vows to livestream trades on Twitch, mispronouncing livestream as “give-stream.” In weight loss videos, AI actors stumble over phrases like “I lost 35 lbs,” awkwardly saying “ell-bees.” I have witnessed fake Elon Musks pronounce “DOGE” as “doggy” in cryptocurrency scams.

Nonetheless, there’s not always a tell.

Upon starting my investigation into AI video ads on YouTube, I examined every performer I encountered. Distinguishing between an airbrushed model and a polished AI creation or differentiating a poor