Is there a place for AI in eating disorder treatment?


This headline is probably one of hundreds you’ve seen in recent months about artificial intelligence. Right now, as AI technologies advance at an exponential rate, the possibilities of transforming the ways we interact with information and the world around us seem endless. However, alongside its potential, the application of AI also comes with warning signs. Let’s explore how these complexities play out in the realm of eating disorder treatment.


Current enthusiasm for AI is palpable, with statistics showing that a substantial portion of internet users prefer chatbots for simple inquiries. Consumers and businesses alike are drawn to AI for its efficiency, affordability, and 24/7 availability. The healthcare sector is no exception to this trend, as experts predict that up to 73% of healthcare administrative tasks could be automated by AI in the near future.


When it comes to eating disorder treatment, AI can certainly be a powerful tool in streamlining processes, easing strains on mental health practitioners, and removing barriers to entry for care. With the right supervision in place, it can help send supportive nudges to individuals in treatment, offer tailored resources and treatment options, assist dieticians in creating meal plans, and more.

However, the mental healthcare field must remain vigilant about AI's potential limitations and consequences. One such possibility was brought to life in May 2023, when the US-based National Eating Disorders Association (NEDA) replaced its human helpline staff with a chatbot named Tessa. Within a week, users reported that Tessa gave harmful advice promoting dieting and weight-loss behaviours to people struggling with eating disorders. The bot was consequently shut down.


How could this happen? Chatbots are trained on massive datasets that can contain prevalent yet problematic ideas from our society’s diet culture about food, weight, and body image. This can result in responses that reflect weight bias, fatphobia, and the promotion of unhealthy ideals, potentially causing harm to individuals already struggling with these issues.



And there are many other reasons AI chatbots are not fit to provide support for someone struggling within the spectrum of eating disorders and disordered eating. These conditions are nuanced and can vary from person to person—they’re often characterized by hidden or unconventional symptoms. Effective treatment requires highly personalized approaches, especially given that most individuals with eating disorders have co-occurring mental health conditions. The rigid nature of AI algorithms means they might overlook non-standard patterns of symptoms or misassess individuals whose symptoms do not conform to a predefined mold.

Additionally, AI chatbots can provide false information, posing a significant risk to patients seeking accurate guidance for eating disorder-related concerns. These bots are designed to draw on their training data, but they can generate "hallucinations"—fabricated information presented as fact—which can be detrimental when assessing risks, treatment options, or other critical aspects of eating disorder care.



Certainly, eating disorder treatment requires a human touch that AI chatbots cannot replicate. Beyond the initial focus on behavioural modifications, individuals often need support, supervision, and empathy to navigate their recovery journey effectively. For someone suffering from eating disorder symptoms, connecting with individuals who have experienced recovery themselves can be a crucial component of the healing process. Research demonstrates that the involvement of loved ones significantly improves recovery outcomes, and clinical guidelines outline the importance of the role of family members in recovery.


Chatbots, despite their speed and apparent intelligence, lack the emotional depth and nuanced expert knowledge necessary for effective eating disorder care. And while AI is not inherently negative, a thoughtful, careful approach to applying technology in care work has always been essential.


A note from our team: If you or someone you know is facing disordered eating or body image challenges, this is deserving of attention and support. Using our chatbot on the bottom right (or by clicking here), you can connect with our dedicated humans to ask your questions, share part of your story, and determine if the Kyla Fox Centre is the right place for support.

