Members of Twitter’s Trust and Safety Council Not Sure Elon Musk Knows They Exist

Members of Twitter’s Trust and Safety Council—a group of 100 organizations working on issues including harassment, content moderation, and suicide prevention on the platform—say they’re unsure about their future, and whether Elon Musk, who took over Twitter last week, even knows they exist.

“Now I feel like we’re in a different universe,” Danielle Citron, vice president at the Cyber Civil Rights Initiative, told Motherboard. Citron said that although one of the council’s regular meetings is still on the calendar, her organization hasn’t heard from Twitter, and staff seem to be “ghosting” members on updates.

Bloomberg reported on Monday that most people who work in Twitter’s Trust and Safety organization are locked out of the internal tools used for content moderation, and “are currently unable to alter or penalize accounts that break rules around misleading information, offensive posts, and hate speech,” citing anonymous sources familiar with the matter. Musk’s first act as Twitter’s new owner was firing its top executives, including CEO Parag Agrawal, CFO Ned Segal, policy executive Vijaya Gadde, and general counsel Sean Edgett. Gadde worked closely with the council, according to Citron.

Musk has said that he wants to form his own content moderation council, with “widely diverse viewpoints.” Musk tweeted last week that “no major content decisions or account reinstatements will happen before that council convenes.” 

On Wednesday, he tweeted that he’d talked to people at the Anti-Defamation League, Color of Change, and the NAACP, among others, about “how Twitter will continue to combat hate & harassment & enforce its election integrity policies.” 

After Musk’s takeover of Twitter, the platform saw a surge of hate speech, according to Twitter’s head of safety and integrity, Yoel Roth.

Where all of this leaves the existing Trust and Safety Council is unclear. Twitter did not respond to a request for comment about the status of the council. 

“I sadly am not sure Elon Musk knows about the existence of the [Trust and Safety] council as of yet,” Alex Holmes, deputy CEO at the Diana Award Anti-Bullying Campaign and a member of the council, told Motherboard. “The Twitter Trust and Safety Council is a dedicated and passionate global group made up of unpaid representatives from NGOs, safety, hate speech and free speech experts who are there to be critical friends. We have often given our advice on upcoming products/tools, updates, safety issues. We are not an oversight board, and not involved in any moderation decisions, instead supporting a safe and healthy platform which is inclusive of all.” 

Twitter formed the Trust and Safety Council in 2016 as “a new and foundational part of our strategy to ensure that people feel safe expressing themselves on Twitter,” according to its announcement—with more than 40 organizations and experts from 13 regions making up its inaugural members. The council held its first annual summit the following year at Twitter’s San Francisco headquarters, where then-CEO Jack Dorsey participated and heard presentations from members. There are currently 100 organizations representing five different focus areas—content governance, suicide prevention, child sexual exploitation, online safety and harassment, and digital and human rights—listed on the council’s website.

“I felt very plugged-in, like I could always go to Vijaya,” Citron said. “It felt really responsive.”

Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project and a member of the council, told Motherboard that her organization hasn’t heard anything from Twitter since late September.

“From my experience, the Council members are all really dedicated to trying to help Twitter be more responsive to abuse and more transparent and fair in how they enforce their policies,” Llansó said. “There’s still a long way to go, but Twitter staff have made a continual effort to improve the experiences of its most vulnerable users. It’s hard to tell exactly what Musk’s plans are for trust and safety work at Twitter, but it’s disconcerting that he talks about taking the company in a different direction.” 

Before his takeover of the company, Musk frequently complained about what he saw as the platform’s lack of “free speech,” though the closest he has come to defining his vision is an April tweet describing free speech as “that which matches the law.”

“I don’t think he’s about free speech; I think he’s about ‘free speech that I like,’” Citron said.

Twitter has always had major flaws in how it has handled privacy, safety, and user trust. It has been widely criticized as slow to address hate speech and trolling while rolling out features no one asked for. The council itself accused Twitter of not listening or being responsive enough in 2019, in a letter to Dorsey obtained by Wired. But even with those problems, members of the council say that disbanding a group that has spent years working on safety, at a critical moment in the platform’s history, would be a mistake.

“It would be a shame to see the work and passion of this global group disbanded, and I am hopeful there is a way to continue to work with Twitter under new direction,” Holmes said.

“If Twitter dissolves the Council, I worry that could signal a retrenchment by Twitter, as far as seeking outside expertise, and a decision to deprioritize crucial trust and safety work,” Llansó said. “Twitter needs to have some process for engaging outside experts and perspectives in order to better inform its work.”