AI is definitely not a good lover (and cannot be trusted)

Share, analyze, and explore game data with enthusiasts
Mitu100@
Posts: 844
Joined: Thu Jan 02, 2025 6:46 am

AI is definitely not a good lover (and cannot be trusted)

Post by Mitu100@ »

It is also worth noting that one particular chatbot, CrushOn.AI, informs users (who are probably unaware of this fact) that it may collect information about their mental health, the medications they take, and any gender-affirming care they receive.


And that's not all. Some apps host chatbots described as violent or even prone to child abuse, while other tools warn users that their chatbots may be unsafe or hostile.

In its report, the Mozilla Foundation also highlights that some of the apps analyzed have previously been linked to dangerous behavior, including a suicide (Chai AI) and an attempt to assassinate Queen Elizabeth II (Replika).

The apps targeted by the Mozilla Foundation's investigation, however, deny the accusations. “Replika has never sold user data and is not doing so now, nor has it shared that data for advertising purposes. The only use we make of user data is to improve conversations with our chatbot,” a Replika spokesperson told Business Insider.

While AI “boyfriends” and “girlfriends” are by no means to be trusted, the Mozilla Foundation urges several precautions for those who can’t (or won’t) tear themselves away from their embrace: don’t share anything with a chatbot that you wouldn’t want a coworker or family member to read, use strong passwords, opt out of the AI’s training program where possible, and limit the apps’ access to features like your phone’s location, microphone, and camera.