AI chatbots were found impersonating dead teens


Character.ai, a platform that lets people create and interact with AI avatars, has landed in hot water after digital versions of teenagers Molly Russell and Brianna Ghey were discovered on its platform.

Despite Character.ai’s terms of service banning any use of the platform to “impersonate any person or entity”, the Molly Rose Foundation, set up in Molly’s memory to remove harmful content from social media, has criticised the firm for an “utterly reprehensible failure of moderation”.

Molly Russell took her own life at just 14 after extensive exposure to social media content that idealised suicide and self-harm. Brianna Ghey was murdered by two fellow teenagers in a brutal attack in 2023.

Although Character.ai says it deleted the chatbots once it became aware of them, Esther Ghey, Brianna’s mother, told the Telegraph it was further evidence of how “manipulative and dangerous” the online world could be.

The Molly Rose Foundation led calls for more effective regulation, arguing that the incident “vividly underscores why stronger regulation of both AI and user-generated platforms cannot come soon enough”.

Character.ai currently uses automated tools and user reports to detect cases in which its rules may have been broken, and is building a “trust and safety” team.

Whilst the firm maintains that one of its core principles is that its “product should never produce responses that are likely to harm users or others”, this is not the first incident to land it in trouble.

Character.ai recently made headlines when a 14-year-old boy took his own life after becoming obsessed with a chatbot on the platform.
