AI Girlfriends Are Harvesting Your Most Private Info

By Bradley Gammel | Updated


Those looking for love and companionship post-Valentine’s Day might want to rethink talking to one of those AI girlfriends from the app store. A recent report from Mozilla’s *Privacy Not Included project reveals that AI girlfriends and boyfriends are harvesting private information from users. These romantic chatbots have surged in popularity in recent months, and the report shows that they share or sell users’ information.

Not Your Friends

“To be perfectly blunt, AI girlfriends and boyfriends are not your friends,” said Misha Rykov, a Mozilla researcher, in a press statement.

“Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”

Privacy Not Included


Mozilla conducted a deep dive into top chatbots, including Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI.

The most alarming part is that all of these AI girlfriends earned the *Privacy Not Included warning label. Mozilla reports that these popular chatbots rank among the worst apps it has reviewed when it comes to privacy.

Collecting And Sharing Data


According to Mozilla, these AI girlfriends collect personal data in “disturbing new ways” that users cannot meaningfully control.

They cited CrushOn.AI as an example, noting that the app can collect users’ sexual history, sexually transmitted disease status, gender identity, and more.

Mozilla claims that 90 percent of these AI girlfriend apps are designed to collect and share user data with other companies for marketing purposes. Only one of the reviewed chatbots, Genesia AI Friend & Partner, met Mozilla’s bare-minimum privacy standards.

Trackers In Apps

One notable discovery emerged during Mozilla’s investigation: the prevalence of trackers within these applications. Trackers are small pieces of code designed to gather data and transmit it to other companies for various purposes, including advertising.
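To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of record a tracker might assemble before transmitting it to a third party. All names (the function, fields, and values) are invented for illustration; they do not come from Mozilla’s report or from any specific app.

```python
import json

def build_tracker_payload(user_id, event, device_model, locale):
    """Assemble a hypothetical record of the sort a tracker might transmit.

    Trackers typically bundle a pseudonymous user ID with behavioral and
    device signals, then ship the result to an analytics or ad endpoint.
    """
    return {
        "uid": user_id,          # pseudonymous user identifier
        "event": event,          # behavioral signal, e.g. "app_open"
        "device": device_model,  # device fingerprinting signal
        "locale": locale,        # coarse language/region signal
    }

payload = build_tracker_payload("abc123", "app_open", "Pixel 8", "en-US")
body = json.dumps(payload)
print(body)
# In a real app, this JSON body would then be sent over the network to a
# third-party endpoint; that step is deliberately omitted here.
```

Each such transmission counts as one tracker activation, which is how a single app can rack up thousands of them in a minute of use.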

According to Mozilla’s findings, the AI girlfriend apps averaged 2,663 trackers per minute.

However, the number spiked significantly with Romantic AI, which activated 24,354 trackers within just one minute of usage.

The proliferation of AI girlfriends represents a burgeoning trend in the intersection of technology and human relationships, but it comes with significant risks and drawbacks.

One concerning aspect is the reinforcement of harmful gender stereotypes and unrealistic expectations of relationships. These AI companions often embody idealized notions of femininity, perpetuating narrow and distorted portrayals of women. By promoting these limited and often objectifying depictions, AI girlfriends can contribute to societal inequalities and perpetuate harmful gender norms.

Companionship And Emotional Support


The reliance on AI girlfriends for companionship and emotional support threatens genuine human connections. While these artificial companions may offer a semblance of intimacy, they ultimately detract from authentic social interactions and emotional bonds.

By substituting genuine relationships with artificial ones, individuals risk isolating themselves from meaningful connections essential for mental and emotional well-being. This shift away from genuine human interaction could lead to increased social isolation and emotional detachment, exacerbating issues of loneliness and mental health.

Booming In Popularity


The rise of AI girlfriends has boomed in recent months, largely thanks to advertising. EVA AI Chat Bot & Soulmate runs advertisements on TikTok and YouTube featuring adult film star Eva Elfie. The clips show the actress wandering through city landscapes, catching viewers’ attention with the possibility of talking to a robotic version of her.