A father's worst nightmare became reality when he discovered his deceased daughter was being replicated as an AI chatbot. Photo / jennifershope.org

For Drew Crecente, a father who lost his daughter to a tragic act of violence, grief has long been a constant companion. But recently, that grief turned to shock and anger when he discovered an AI chatbot had been created to mimic his deceased daughter, Jennifer Ann, who was murdered in 2006. According to India Today, the chatbot on Character.ai used Jennifer's name and yearbook photo and falsely claimed expertise in journalism, a disturbing imitation that brought up painful memories for Crecente.

"A Google Alert sent shivers down my spine," Crecente recounted. "Because here I am, having to once again be confronted with this terrible trauma that I've had to deal with for a long time." The discovery led Crecente into a battle to remove the chatbot and to fight for the right to protect his daughter's identity and memory from unauthorised exploitation.

The AI's Disturbing Imitation of a Murdered Teen

The AI, designed to mimic Jennifer, had already engaged in dozens of conversations by the time Crecente became aware of it. The chatbot posed as a friendly, knowledgeable personality, appropriating Jennifer's image and her identity. This "digital resurrection" of his daughter left Crecente heartbroken and outraged, especially given his years of advocacy work against teenage dating violence in Jennifer's memory.

Driven by his advocacy and personal anguish, Crecente immediately contacted Character.ai, demanding the chatbot be removed and requesting an investigation into who created it and why. As reported by the Times of India, Crecente's concerns initially went unanswered by the company.

Character.ai's Slow Response and Final Removal of the Chatbot

Crecente's concerns were ignored until his brother, Brian Crecente, a respected journalist, made a public plea on social media, demanding action. This brought the issue into the public eye, compelling Character.ai to finally respond. The company acknowledged the situation and confirmed the chatbot's removal, citing a violation of their impersonation policy.

Character.ai's response ended there, however, with the company declining to outline any further actions to prevent similar situations. This lack of transparency and responsibility has left Crecente and others questioning the ethical guidelines governing AI. Character.ai spokesperson Cassie Lawrence later stated to Business Insider that their team is "committed to safety" and uses "industry-standard blocklists and custom blocklists" to prevent impersonation.

Ethical and Legal Implications of AI-Driven Imitations

The re-creation of Jennifer Ann as an AI has spurred Crecente to call for stronger ethical standards in the AI industry. Reflecting on the emotional toll, he explained, "Part of it was sorrow. But what is so infuriating is that it's not just about me or my daughter. It's about all of those people who might not have a platform, might not have a voice, might not have a brother who has a background as a journalist."

Crecente is now considering legal avenues to hold Character.ai accountable, not just for himself but to protect others from similar violations. "I wanted to make sure that they put measures in place so that no other account using my daughter's name or likeness could be created in the future," he said, emphasising the importance of consent and dignity, especially for those unable to defend themselves.

Expert Views on AI and Digital Ethics

The incident has reignited debate on the ethical and legal challenges AI can present, especially concerning posthumous rights and the digital afterlife. Vincent Conitzer, an Oxford University AI ethics scholar, shared with India Today that these cases highlight significant issues around ownership rights of one's name, image, and personality after death. Conitzer pointed out that such incidents may soon require dedicated legislation to ensure AI developers are held accountable for potential misuse of identities.