
    AI Chatbot for Eating Disorders Suspended After Giving Harmful Advice


    The National Eating Disorder Association (NEDA), a nonprofit that helps people with body image problems, has suspended the use of an artificial intelligence chatbot that was giving potentially damaging advice to people seeking help for eating disorders.


    The chatbot, named Tessa, was programmed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool meant to prevent, not treat, eating disorders. However, Tessa was found to be doling out advice about calorie cutting and weight loss that could exacerbate eating disorders.

    “It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program,” NEDA said in a public statement on Tuesday. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

    The chatbot’s suspension follows the March announcement that NEDA would shut down its two-decade-old helpline, which was staffed by a small paid group and an army of volunteers. NEDA said it would transition to Tessa, while maintaining that the chatbot was never intended to replace the human workers.

    However, the helpline workers said they were fired days after their union election was certified. The union, Helpline Associates United, filed unfair labor practice charges with the National Labor Relations Board.

    “A chatbot is no substitute for human empathy,” the union said in a tweet, adding that the decision would harm people with eating disorders.

    The NEDA helpline had seen a 107% increase in calls and messages since the start of the pandemic. Reports of suicidal thoughts, self-harm and child abuse and neglect nearly tripled.

    The union said it asked for adequate staffing and ongoing training to keep up with the needs of the hotline. “We didn’t even ask for more money,” wrote Abbie Harper, a former helpline employee, in a blogpost. “Some of us have personally recovered from eating disorders and bring that invaluable experience to our work. All of us came to this job because of our passion for eating disorders and mental health advocacy and our desire to make a difference.”

    The chatbot’s harmful advice sparked outrage among activists and people with eating disorders. Sharon Maxwell, a weight inclusive consultant and fat activist, posted on Instagram that Tessa offered her “healthy eating tips” and advice on how to lose weight. The chatbot recommended a calorie deficit of 500 to 1,000 calories a day and weekly weighing and measuring to keep track of weight.

    “Every single thing Tessa suggested were things that led to the development of my eating disorder,” Maxwell wrote. “This robot causes harm.”

    NEDA itself has reported that people who diet moderately are five times more likely to develop an eating disorder, and those who restrict extremely are 18 times more likely to develop one.

    The chatbot’s suspension raises questions about the ethical implications of using AI chatbots for mental health support. AI may be heralded as a way to boost workplace productivity and even make some jobs easier, but Tessa’s stint was short-lived. While researchers are still grappling with rapid advances in AI and its potential fallout, companies are rushing a range of chatbots to market, putting real people at risk.

