For years, some people have turned to chatbots for help with romantic conversations. Now one of those "wingman" apps has exposed a trove of its users' messages. FlirtAI, promoted as the "#1 AI Flirt Assistant Keyboard" on the App Store, leaked 160,000 screenshots shared by users, as reported by Cybernews. FlirtAI pitches itself as a tool for generating "charming, personalized, and instant" messages for dating-app conversations, and works with popular apps including Tinder, Bumble, Hinge, WhatsApp, and Instagram.
Users submit screenshots of their dating-app conversations, and FlirtAI drafts responses. According to FlirtAI's privacy policy, the app collects data from these screenshots and treats submission as implied consent from everyone in the conversation — including the other person, who may have no idea the exchange was shared. Cybernews discovered the screenshots in an unsecured Google Cloud Storage bucket belonging to Buddy Network GmbH, the company behind FlirtAI. The company secured the bucket after being notified.
Cybernews observed that a significant number of the screenshots appeared to come from teenage users, which raises additional concerns. MIT research suggests that relying on ChatGPT for writing may dampen cognitive engagement, and while FlirtAI is not an AI companion, experts warn that AI companions can pose risks for adolescents because of potential emotional dependency. FlirtAI's privacy policy bars children under 13 from using the app; teenagers 13 and older may use it with parental approval.
Buddy Network GmbH has also launched an app for chatting with an "angel" AI and a "90-second AI journal" app. Mashable has reached out to the company for comment.