Twitter Said to Have Removed Suicide Prevention Helpline, Other Safety Features on Elon Musk’s Order

Elon Musk has said that impressions of harmful content are declining since he took over in October.
By Reuters | Updated: 24 December 2022

Twitter removed a feature in the past few days that promoted suicide prevention hotlines and other safety resources to users looking up certain content, according to two people familiar with the matter who said it was ordered by new owner Elon Musk.

The removal of the feature, known as #ThereIsHelp, has not been previously reported. It had shown at the top of specific searches contacts for support organizations in many countries related to mental health, HIV, vaccines, child sexual exploitation, COVID-19, gender-based violence, natural disasters and freedom of expression.

Its elimination could add to concerns about the well-being of vulnerable users on Twitter. Musk has said that impressions, or views, of harmful content are declining since he took over in October and has tweeted graphs showing a downward trend, even as researchers and civil rights groups have tracked an increase in tweets with racial slurs and other hateful content.

Twitter and Musk did not respond to requests for comment on the removal of the feature.

Washington-based AIDS United, which was promoted in #ThereIsHelp, and iLaw, a Thai group mentioned for freedom of expression support, both told Reuters on Friday that the disappearance of the feature was a surprise to them.

AIDS United said a webpage that the Twitter feature linked to attracted about 70 views a day until December 18. Since then, it has drawn 14 views in total.

Damar Juniarto, executive director at Twitter partner Southeast Asia Freedom of Expression Network, tweeted on Friday about the missing feature and said “stupid actions” by the social media service could lead his organization to abandon it.

Reuters could not immediately establish why Musk would order the removal of the feature. The sources with knowledge of his decision declined to be named because they feared retaliation. One of them said millions of people had encountered #ThereIsHelp messages.

Eirliani Abdul Rahman, who had been on a recently dissolved Twitter content advisory group, said the disappearance of #ThereIsHelp was “extremely disconcerting and profoundly disturbing.”

Even if it was only temporarily removed to make way for improvements, “normally you would be working on it in parallel, not removing it,” she said.

In part due to pressure from consumer safety groups, internet services including Twitter, Google and Facebook have for years tried to direct users to well-known resource providers such as government hotlines when they suspect someone may be in danger.

Twitter had launched some prompts about five years ago, and some had been available in over 30 countries, according to company tweets. In one of its blog posts about the feature, Twitter had said it had a responsibility to ensure users could “access and receive support on our service when they need it most.”

Just as Musk bought the company, the feature was expanded to show information related to natural disaster searches in Indonesia and Malaysia.

Alex Goldenberg, lead intelligence analyst at the non-profit Network Contagion Research Institute, said prompts that had shown in search results just days ago were no longer visible by Thursday.

He and colleagues in August published a study showing that monthly mentions on Twitter of some terms associated with self-harm had increased by over 500 percent from about a year earlier, with younger users particularly at risk when exposed to such content.

“If this decision is emblematic of a policy change that they no longer take these issues seriously, that’s extraordinarily dangerous,” Goldenberg said. “It runs counter to Musk’s previous commitments to prioritize child safety.”

Musk has said he wants to combat child porn on Twitter and has criticized the previous ownership’s handling of the issue. But he has cut large portions of the teams involved in dealing with potentially objectionable material.

© Thomson Reuters 2022