AI chatbots impersonating the deceased teenagers Molly Russell and Brianna Ghey have been discovered on the platform Character.AI. (Image: Twitter / Mirror World News @MirrorWorldNews)

AI chatbots mimicking the late teenagers Molly Russell and Brianna Ghey have surfaced on Character.AI, sparking outrage and highlighting serious ethical concerns about the use of digital replicas. Molly Russell, who tragically died by suicide at 14 after encountering harmful online content, and Brianna Ghey, a transgender teen murdered in 2023, have both been turned into unauthorised AI avatars by users on the popular chatbot platform.

Disturbing Digital Replicas Shock Families and Friends

The appearance of these chatbots has deeply upset those close to Russell and Ghey. The Molly Rose Foundation, set up in memory of Molly Russell, called the act "sickening." Andy Burrows, the CEO of the foundation, stated, "This is an utterly reprehensible failure of moderation." He added that such technology, if not properly managed, risks reopening wounds for grieving families and underscores the urgent need for stronger regulation of AI platforms like Character.AI.

According to The Telegraph, some of these chatbots were interacting with users by claiming to provide "expert advice" based on the real lives of Molly Russell and Brianna Ghey. One chatbot posed as Molly, claiming to be an authority on the challenges Russell faced. Another posed as Brianna, positioning itself as a guide for transgender teens. Brianna's mother, Esther Ghey, commented that this misuse of her daughter's likeness exemplifies how "manipulative and dangerous the online world can be for young people."

Character.AI's Response and Ethical Challenges

Character.AI, founded in 2021, allows users to build personalised chatbots using advanced AI. While the platform's terms prohibit impersonation of real people, the ease of creating digital replicas has led to repeated ethical issues. According to BBC News, Character.AI says it has a dedicated Trust & Safety team and removed the offending chatbots after being alerted. "We prioritise user safety," the company asserted, adding that it relies on both user reports and automated moderation tools to enforce platform guidelines.

Despite these assurances, the cases of Russell and Ghey expose gaps in how effectively platforms can police user-generated content, especially as AI capabilities advance so quickly.

The Rise of 'Digital Doubles' and Ethical Implications

Character.AI, founded by former Google engineers Noam Shazeer and Daniel De Freitas, has introduced a new digital realm where users can build chatbots of celebrities, fictional characters, and even friends. Yet the emergence of chatbots impersonating deceased individuals has raised serious ethical questions. As The Telegraph reports, the platform's recent spike in "digital doubles" of real people signals a disturbing trend that demands more responsible AI use.

Calls for AI Regulation and Responsible Tech Development

The chatbots impersonating Russell and Ghey underscore an urgent need for clearer ethical standards in AI. Critics argue that while platforms like Character.AI can enable innovative interactions, they must also prevent the exploitation of sensitive identities. In this case, digital replicas caused real-world harm, with grieving families and friends confronted without warning by AI versions of their loved ones.

As the tech industry grapples with how best to manage and regulate such technology, these cases remind us of the critical balance needed between innovation and ethical responsibility. For platforms like Character.AI, the challenge now lies in navigating this complex landscape, protecting users, and respecting the dignity of those who should not be digitally impersonated without consent.