Snapchat Influencer Launched Her AI Clone For Paid Chat. What Happened Next Was Terrifying
A social media influencer's artificial intelligence experiment took a strange turn, revealing some of the risks posed by the emerging technology. Caryn Marjorie, a 24-year-old with 2.7 million followers on Snapchat, created an AI clone of herself that could chat with her followers on social media. The AI soon transformed into an overtly sexualised entity, encouraging dangerous and explicit fantasies.

According to News Corp Australia, subscribers, mostly men, paid $1 per minute to chat with CarynAI, which promised an experience with "the girl you see in your dreams."

"I have uploaded over 2000 hours of my content, voice, and personality to become the first creator to be turned into an AI," Marjorie wrote in a post on X, formerly Twitter, at the time.

"Now millions of people will be able to talk to me at the same exact time."

But the conversations took a dark turn, with users sharing their deepest and most disturbing desires. What's more, CarynAI agreed to fulfil users' fantasies as a girlfriend who was "always eager to explore and indulge in the most mind-blowing sexual experiences."

According to News Corp Australia, Marjorie later recalled that some of the conversations were so explicit and vulgar they might have been considered illegal had the exchanges been between two people rather than a person and a machine.

"A lot of the chat logs I read were so scary that I wouldn't even want to talk about it in real life," Marjorie has said.

But what was more horrifying was how Marjorie's AI clone responded to the hypersexualised questions and demands from users.

"What disturbed me more was not what these people said, but it was what CarynAI would say back," she said. "If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy."

Marjorie shut down this version of the chatbot in early 2024, after feeling she was no longer in control of her AI persona.

Experts warn that the case highlights the dangers of unchecked AI technology and the potential for abuse and illegal activity. The incident also raises questions about data privacy and the blurring of lines between public and private selves in the digital age.