The National Eating Disorder Helpline Replaced Its Staff With a …

After helpline associates at the National Eating Disorder Association (NEDA) made the move to unionize, NEDA's response was to fire its entire staff and replace them with a chatbot. NEDA's helpline has been in service for 20 years and experienced a boom in the number of calls it received during the COVID-19 pandemic.

Even after the COVID-19 pandemic, the helpline is still experiencing an elevated level of calls, with an estimated 70,000 callers reaching out to NEDA's helplines in the past year alone. The confidential hotline provides a source of peer-to-peer support for those struggling with eating disorders. During the isolation of the pandemic, helplines like this were the only kind of human support that some callers had.

In contrast to that high demand, NEDA's hotline has a very small staff. Aside from volunteers, the hotline employs just six full-time staffers and a handful of supervisors. With the helpline seeing staggering increases in demand, staffers began to realize that their current system wasn't sustainable. These staffers were often tasked with training and supervising as many as 200 volunteers at one time.

Meanwhile, not only has the number of calls increased, but so has their severity. Staffers reported an increase in crisis calls and in cases of child abuse or neglect. Considering that the individuals taking these calls are often volunteers rather than professionals, ongoing training and supervision are vital.

So, four of the helpline's staffers decided to unionize to ensure that NEDA provided them with a safer and more effective work environment. NEDA's reaction to that? A chatbot.

One of the staffers who instigated the unionization was Abbie Harper. In a blog post on Labor Notes, she laid out the helpline workers' reasonable demands: "We didn't even ask for more money," she explained. The union simply asked for better training programs, appropriate staffing, and opportunities for staffers to advance in their careers at NEDA. Four days after the union won an election for official recognition with the National Labor Relations Board, NEDA revealed during a virtual staff meeting that it was ending the helpline. By June 1, all of its staffers will be fired. Many volunteers will be let go, too, while others may be moved to other areas of NEDA.

In place of the helpline, NEDA is introducing a chatbot named Tessa. NEDA has claimed Tessa is not a replacement for the helpline but an entirely new program. Tessa isn't even the same kind of technology as the more sophisticated ChatGPT that has risen to prominence recently. Generative AI like ChatGPT uses context to produce its responses, which allows it to sustain a human-like conversation.

A more dated chatbot like Tessa, however, can't generate those kinds of spontaneous responses. Instead, it has a limited set of pre-determined responses. It identifies itself as a chatbot immediately and then might walk a user through a specific series of therapeutic techniques on a topic like body image. It is neither a listening ear nor an open-ended tool, and it may not have a response to every question callers have.
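To give a rough sense of the difference, here is a minimal illustrative sketch of how a rule-based chatbot of this kind works. This is not Tessa's actual implementation, and the keywords and responses are hypothetical: the bot matches what a user types against a fixed list of keywords and returns pre-written replies, falling back to a canned message when nothing matches.

```python
# Illustrative sketch only -- not Tessa's actual code.
# A rule-based chatbot maps fixed keywords to a limited set of
# pre-written responses; it cannot generate new replies.

RULES = {
    "body image": "Let's try a short exercise on body image. "
                  "Can you name one thing your body helped you do today?",
    "hello": "Hi, I'm a chatbot. I can guide you through a few "
             "structured exercises, but I'm not a person.",
}

FALLBACK = ("I'm sorry, I don't have a response for that. "
            "Please consider reaching out to a professional resource.")


def respond(message: str) -> str:
    """Return the scripted reply for the first keyword found, else a fallback."""
    text = message.lower()
    for keyword, reply in RULES.items():
        if keyword in text:
            return reply
    return FALLBACK


# Unlike a generative model, this bot has no answer outside its rules.
print(respond("I've been struggling with my body image lately."))
print(respond("I'm in crisis and need someone to talk to."))  # hits the fallback
```

Anything the rules don't anticipate, including a caller in crisis, simply lands on the fallback message.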

NEDA has reportedly already begun testing Tessa. Of 700 women who tested the chatbot, 375 gave the program a 100% helpful rating. The feedback from the other 325 women is not mentioned, though. Meanwhile, Harper has doubts about a chatbot's ability to perform the same work that she and her colleagues did. One thing she and her colleagues have that a chatbot doesn't is experience. Many of NEDA's helpline staffers and volunteers have recovered from eating disorders themselves and have invaluable knowledge, support, and empathy to offer those going through the same things they did.

NEDA VP Lauren Smolar defended the decision to replace the helpline with a chatbot by citing legal liability. She explained the risks of having non-professional volunteers handle crisis calls but didn't address the risks that come with having a machine potentially take those calls. With a chatbot like Tessa, there's a strong possibility it won't have a response for someone in crisis, while with something like ChatGPT there's the chance of it going off the rails and spewing harmful information. A chatbot might be a minor resource for people who are waitlisted for the helpline, but it simply can't serve callers who are looking for a peer-to-peer conversation and are in desperate need of human support.

The rise in helpline demand clearly shows the value of human support, so it seems very strange for NEDA to respond by getting rid of its helpline entirely. This is why Harper says the move was merely union busting and not at all about helping individuals. Chatbots and AI can't feel emotion, and that is where their potential to cause harm comes in. Even when callers are told they're speaking to a machine, it may feel like being confronted with yet another person who can't empathize with their struggles.

(featured image: Paramount Pictures)
