Romantics looking for virtual love should approach chatbots offering amorous AI with caution, according to researchers behind Mozilla's "*Privacy Not Included" buyer's guide.
The researchers examined 11 "AI soulmate" apps and gave all of them a thumbs-down for failing to provide adequate privacy, safety, and security for the data they collect from users.
They found that 10 of the 11 chatbots failed to meet Mozilla's Minimum Security Standards, such as requiring strong passwords or having a system for managing security vulnerabilities.
The report also found that most of the apps' privacy policies said very little about how they use the content of user conversations to train their AIs, and offered little transparency into how their AI models work.
"Most of these 11 apps were made by small developers that you couldn't find a lot of information about," Jen Caltrider, the guide's director, told TechNewsWorld.
Manipulation on Steroids
According to the report, users also have little or no control over their data, creating a large potential for manipulation, abuse, and mental health consequences.
"These apps are designed to get to know you better," Caltrider explained. "They're interested in learning about your life. The more information you give them, the better equipped they are to talk with you and possibly become your soulmate."
"If you are an evil person and want to manipulate people, this is manipulation at its best," Caltrider said. "You've created a chatbot that's going to get to know a person and build a relationship with them, that's going to become their friend. Then you can use that chatbot to influence their thinking and actions."
The report also criticized the app makers for failing to give users the option of opting out of having the contents of their private chats used to train the AI models behind these programs. Only one app, Genesia AI, offered a workable opt-out, which the researchers noted proves the feature is viable.
"Consumers concerned that their data will be used for marketing or to train artificial intelligence engines without permission should carefully review a company's data collection practices and exercise their rights to opt in or out of data collection, selling, sharing, or retention," said James E. Lee, chief operating officer of the Identity Theft Resource Center, a nonprofit in San Diego, Calif., dedicated to minimizing the risk and mitigating the impact of identity theft and crime.
"Retained data could be exploited by cybercriminals for identity theft or ransomware attacks," he told TechNewsWorld.
AI Romance Apps Skyrocket in Popularity
The report notes that apps and platforms using AI algorithms to simulate interaction with romantic partners are surging in popularity. Over the past year, the 11 relationship chatbots Mozilla examined racked up an estimated 100 million downloads on the Google Play store.
When OpenAI opened its GPT Store last month, AI relationship chatbots quickly became a common sight there, the report added, even though they violate the store's policies.
In a survey of 1,000 adults conducted by Propeller Insights for Infobip, a global omnichannel communications company, 20% of Americans admitted to flirting with a chatbot. The share was higher among those aged 35 to 44.
Curiosity was the most common reason given for flirting with a chatbot, followed by loneliness (47.2%) and the enjoyment of interacting with one (23.9%).
The surge in AI romance bot use can be attributed to a combination of societal changes and technological advances, according to Brian Prince of Top AI Tools, an AI tool, resource, and educational platform in Boca Raton, Fla.
"With loneliness on the rise and many people feeling increasingly disconnected, they are turning to chatbots for emotional support and companionship," he told TechNewsWorld. "It's like having a friend at your fingertips, ready to chat whenever you want. Plus, as AI gets smarter, the bots feel more real and engaging, drawing in even more people."
From Code to Sweet Nothings
AI chatbots have also become far more accessible to developers, noted Brandon Torio, a senior product manager at Synack, an enterprise security firm in Redwood City, Calif. Thanks to well-documented APIs, adding one to a website is as simple as embedding a YouTube or Spotify video.
"With just a few lines of code, you can prompt a ChatGPT-like model to converse with customers in any way you like, whether the goal is to teach them about a product or to whisper sweet nothings for Valentine's Day," he told TechNewsWorld.
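To make Torio's point concrete, here is a minimal sketch of those "few lines of code," using OpenAI's Python client; the model name and persona prompt are illustrative assumptions, not details from the report:

```python
# Minimal chatbot sketch: a system prompt sets the bot's persona,
# and each user message goes through the chat completions API.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

def chat(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "You are a warm, attentive companion."},  # hypothetical persona
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(chat("Say something sweet for Valentine's Day."))
```

Swapping the system prompt from product expert to doting companion is essentially the entire difference between a support bot and a romance bot, which is why these apps are so cheap to build.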
"With everything humans have been through in the last few years, it isn't surprising that people are now turning to computers for companionship and romance," said Ron Arden, CTO of Fasoo, a provider of enterprise data protection solutions in Seoul, South Korea.
"We were all isolated by the pandemic, and it's hard to meet new people," he said. "Chatbots make it easy to communicate, just as texting does. There's none of the embarrassment of direct human interaction: give me what I want, and I'll get on with the rest of my day."
"It's also part of the general increase in using apps for almost everything, including measuring your blood pressure and counting calories," he added. "It's easy, convenient, and non-threatening."
Unique Privacy Threat
Mozilla's report also called out the romance bots for deceptive marketing. It cited one app that claims on its website to offer mental health and well-being benefits while disclaiming those very benefits in its terms and conditions.
Caltrider said it is both misleading and confusing for apps to market themselves as mental health, well-being, or self-help tools while clearly stating in their legal documents that they provide no mental health services.
AI-powered romance chatbots pose a unique threat to privacy, argued James McQuiggan of KnowBe4, a security awareness training provider in Clearwater, Fla.
"They may engage in more intimate and personal conversations with users," he told TechNewsWorld, "which can lead to the collection and use of more sensitive data."
"Romance chatbots have the potential to be a great tool for people exploring their sexuality, a way to try out conversations they would be too embarrassed to have with a person," added Jacob Hoffman-Andrews, a senior staff technologist at the Electronic Frontier Foundation, an international digital rights nonprofit based in San Francisco.
"That only works if the bot has extremely strict privacy practices," he told TechNewsWorld. "They should never train the AI on private chats. Private chats should never be shown to human evaluators. They should ensure chats can be truly deleted and offer automatic deletion after a certain period of time."
"And," he continued, "they should absolutely, under no circumstances, sell information deduced from private chats."
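The automatic-deletion practice Hoffman-Andrews describes amounts to a scheduled retention sweep. Below is a minimal sketch of one in Python against a hypothetical SQLite `messages` table; the 30-day window and the schema are illustrative assumptions, not anything prescribed by the EFF or the Mozilla report:

```python
# Minimal retention sweep: permanently delete chat messages older than
# a fixed retention window. Table name and schema are hypothetical.
import sqlite3

RETENTION_DAYS = 30  # illustrative policy, not from the report

def purge_expired_chats(db_path: str) -> int:
    conn = sqlite3.connect(db_path)
    with conn:  # commits on success, rolls back on error
        cursor = conn.execute(
            "DELETE FROM messages WHERE created_at < datetime('now', ?)",
            (f"-{RETENTION_DAYS} days",),
        )
    conn.close()
    return cursor.rowcount  # number of messages removed

# Run from a daily scheduler (cron, systemd timer, etc.)
if __name__ == "__main__":
    print(f"Purged {purge_expired_chats('chats.db')} expired messages")
```

True deletion would also require applying the same purge to backups and any analytics copies of the conversation data.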