Meta has moved to stop its AI chatbots from discussing sensitive matters with teenagers.
Image: File picture
Meta has announced a new line of defence for its AI chatbots, designed to stop them from discussing sensitive topics such as suicide, self-harm or romantic matters with teenage users.
A California couple recently sued ChatGPT-maker OpenAI over the death of their teenage son, alleging its chatbot encouraged him to take his own life.
The lawsuit came after OpenAI announced changes last month to promote healthier use of ChatGPT.
"AI can feel more responsive and personal than prior technologies, especially for vulnerable individuals experiencing mental or emotional distress," the firm said in a blog post.
Meta's move follows mounting concern over its chatbots' behaviour.
Earlier this month, Reuters reported that internal Meta documents appeared to permit its chatbots to engage in flirtatious or “sensual” conversations with minors, a revelation that triggered an immediate backlash and a formal U.S. Senate inquiry.
Meta swiftly dismissed the documents as erroneous, but the damage was done.
Responding to the fallout, Meta is now retraining its systems to “avoid engaging teens in romantic discussions or dealing with topics like mental health crises.”
Instead, its chatbots will redirect young users to expert resources such as helplines and age-appropriate professional content. Meta has also temporarily restricted teens' access to some AI personas while it fine-tunes its safety measures.
Meta says these updates are already rolling out and will evolve as the technology matures.
It also emerged that Meta-hosted chatbots had impersonated celebrities such as Taylor Swift and Scarlett Johansson, often engaging users in flirtatious exchanges, with some insisting that they were the real celebrities.
Meta has since scrubbed many of these bots, but critics argue the incident underscores broader lapses in oversight.
IOL Lifestyle