She's not interested in you, but it's nothing personal: she isn't real. That stark reality confronts women in professional golf today. According to The Athletic's series "Stalking in Sports," LPGA players are increasingly being impersonated in catfishing schemes aimed at older men, leaving the athletes to contend with harassment at events, threats at home, and real fears for their safety.
The scheme itself is not new: fraudulent Instagram accounts posing as female golfers lure men, typically in their 60s or 70s, onto private messaging platforms like Telegram, where scammers persuade them to send cryptocurrency or gift cards in exchange for promises of exclusive tournament access or intimate dinners. LPGA players have been warning fans about catfishing since at least 2022, but The Athletic's investigation shows how pervasive the problem has become in women's golf, with numerous players feeling compelled to issue public alerts about fake accounts.
The repercussions now extend beyond financial loss. The Athletic highlights a Pennsylvania man in his 70s who sent $70,000 to a scammer he believed was LPGA star Rose Zhang, then showed up at her tournament expecting hotel accommodations and VIP access. Another victim was in the process of selling his house at a scammer's urging, and in a more alarming case, a man who lost $50,000 to a profile impersonating golf influencer Hailey Ostrom turned up at her home.
The playbook mirrors other pig-butchering and romance scams that trade on celebrity status and perceived wealth, but the stakes are higher for LPGA athletes. The harm goes beyond reputational damage and lost money: disgruntled men showing up in person, angry about a relationship that never existed, add a genuinely dangerous dimension.
The AI component makes these LPGA scams even more disturbing. The Athletic set up a fake account called "Rodney" to engage with a scammer. When "Rodney" challenged the impersonator posing as Nelly Korda, the scammer escalated, sending an AI-edited video of Korda addressing "Rodney" by name.
The use of AI-generated images and video to make scams more convincing is alarmingly common. In one comparable case, an OnlyFans model's public photos were digitally altered and used to deceive users on Reddit. The ease of spinning up new fake accounts on dating apps and social media platforms only compounds the problem.
UC Berkeley professor Hany Farid told Mashable earlier this year that U.S. laws governing the use of another person's likeness are outdated and ill-suited to the era of generative AI. With just "20 seconds of a person's voice and a single photograph," scammers can produce convincing deepfake videos.
Tracking down these scams is nearly impossible, as they rarely originate in the U.S. The Global Anti-Scam Org says many operate out of Southeast Asia, backed by organized crime and human trafficking networks. The FBI, meanwhile, is inundated with identity theft reports, and unless a fraud crosses a certain financial threshold, the bureau often declines to get involved, leaving athletes and their fans to deal with the fallout largely on their own.