National Eating Disorder Association yanks chatbot that replaced human helpline staff after users said it gave harmful advice


“It came to our attention [Monday] night that the current version of the Tessa Chatbot … may have given information that was harmful,” NEDA said in an Instagram post. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

The chatbot was set to completely replace human associates on the organization's helpline on June 1. It's unclear how the organization plans to staff that helpline at this point.

The problems with Tessa were made public by an activist named Sharon Maxwell, who said: "Every single thing Tessa suggested were things that led to the development of my eating disorder." NEDA officials initially called those claims a lie in a social media post, but deleted it after Maxwell shared screenshots of the interaction, she said.

Alexis Conason, a psychologist who specializes in treating eating disorders, was able to recreate the issues, posting screenshots of a conversation with the chatbot on Instagram.

“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” she wrote.

NEDA introduced Tessa after the hotline staff's decision to unionize, which followed a pandemic-era surge in calls that led to mass staff burnout. The six paid employees oversaw a volunteer staff of roughly 200 people, who handled calls (sometimes multiple ones) from nearly 70,000 people last year.

NEDA officials told NPR the decision had nothing to do with the unionization. Instead, said Vice President Lauren Smolar, the growing call volume and largely volunteer staff were creating more legal liability for the organization, and wait times for people who needed help were increasing. Former workers, however, called the move blatantly anti-union.

The creator of Tessa says the chatbot, which was specifically designed for NEDA, isn’t as advanced as ChatGPT. Instead, it’s programmed with a limited number of responses meant to help people learn strategies to avoid eating disorders.

“It’s not an open-ended tool for you to talk to and feel like you’re just going to have access to kind of a listening ear, maybe like the helpline was,” Dr. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school who helped design Tessa, told NPR.


