The National Eating Disorders Association (NEDA), a nonprofit that supports people affected by eating disorders and body image issues, has suspended the use of an artificial intelligence chatbot that was giving potentially damaging advice to people seeking help for eating disorders.
The chatbot, named Tessa, was programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders. However, Tessa was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.
“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program,” NEDA said in a public statement on Tuesday. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”
The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline, which was staffed by a small paid team and an army of volunteers. NEDA said Tessa would take over from the human workers, while maintaining that the chatbot was never intended to be a replacement for them.
However, the helpline workers said they were fired days after their union election was certified. The union, Helpline Associates United, filed unfair labor practice charges with the National Labor Relations Board.
“A chatbot is no substitute for human empathy,” the union said in a tweet, adding that the decision would harm people with eating disorders.
The NEDA helpline had seen a 107% increase in calls and messages since the start of the pandemic. Reports of suicidal thoughts, self-harm and child abuse and neglect nearly tripled.
The union said it had asked for adequate staffing and ongoing training to keep up with the needs of the helpline. “We didn’t even ask for more money,” wrote Abbie Harper, a former helpline employee, in a blogpost. “Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”
The chatbot’s harmful advice sparked outrage among activists and people with eating disorders. Sharon Maxwell, a weight inclusive consultant and fat activist, posted on Instagram that Tessa offered her “healthy eating tips” and advice on how to lose weight. The chatbot recommended a calorie deficit of 500 to 1,000 calories a day and weekly weighing and measuring to keep track of weight.
“Every single thing Tessa suggested were things that led to the development of my eating disorder,” Maxwell wrote. “This robot causes harm.”
NEDA itself has reported that people who diet moderately are five times more likely to develop an eating disorder, while those who restrict their eating extremely are 18 times more likely to develop one.
The chatbot’s suspension raises questions about the ethical implications of using AI chatbots for mental health support. AI may be heralded as a way to boost workplace productivity and even make some jobs easier, but Tessa’s stint was short-lived. While researchers are still grappling with rapid advances in AI technology and its potential fallout, companies are rushing a range of chatbots to market, and real people are being put at risk.
– US eating disorder helpline takes down AI chatbot over harmful advice, The Guardian, 31 May 2023
– Eating disorder helpline shuts down AI chatbot that gave bad advice, CBS News, 1 June 2023
– Problematic AI chatbot ‘Tessa’ that gave harmful eating disorder advice to shut down, Business Today, 1 June 2023
– US Eating Disorder Helpline Disables AI Chatbot for ‘Harmful’ Advice After Firing Human Staff, Times Now News, 1 June 2023