Musk said protecting children's safety is a top priority for Twitter

Twitter's new boss and chief executive, Elon Musk, declared that keeping children safe on the platform is a top priority. But after a wave of layoffs and resignations, the key team tasked with moderating child sexual abuse material (CSAM) on Twitter has just one permanent employee left, according to two people familiar with the matter.

It's unclear exactly how many people were on the team before Musk took over. On the professional networking site LinkedIn, Wired magazine identified four Singaporean employees who specialize in child safety. But they publicly said they had left in November.

Researchers say the importance of in-house child safety experts cannot be overstated. Based in Singapore, the team enforces Twitter's CSAM ban across the Asia-Pacific region, which has a population of about 4.3 billion, roughly 60% of the world's total. Only one full-time employee now remains on the team.

The team in Singapore is responsible for some of Twitter's busiest markets, including Japan. According to data aggregator Statista, Twitter has 59 million users in Japan, second only to the United States. However, the Singapore office has also been hit by the widespread layoffs and resignations since Musk took over the company. In the past month, Twitter fired most of its employees and then emailed the remaining staff, asking them to either commit to "high-intensity, long-term work" or accept three months' salary as severance pay.

Carolina Christofoletti, a CSAM researcher at the University of São Paulo in Brazil, said,

the impact of the wave of layoffs and resignations on Twitter's ability to address CSAM is very worrying. If the people who work on child safety inside Twitter are fired or resign en masse, it would be absolutely wrong to think that Twitter won't be affected.

Twitter's child safety experts don't just review CSAM content on the platform. They are also assisted by groups such as the Internet Watch Foundation (IWF) in the UK and the National Center for Missing and Exploited Children (NCMEC) in the US, which scour the internet to identify CSAM content shared on platforms such as Twitter.

The IWF said,

the data the agency sends to tech companies can be automatically deleted by the company's systems without human intervention.

Emma Hardy, IWF communications director, said,

this ensures that the shielding process is as efficient as possible.

But those outside groups typically only see the final product and don't have access to Twitter's internal data, Christofoletti said. She calls internal Twitter data the key: its metadata can help the people who write detection code catch CSAM content before it is shared.

She said,

the only people who can see the metadata are people inside the platform.

Twitter's policy of allowing consensually shared adult content complicates the company's efforts to combat CSAM. Arda Gerkens, head of the Dutch foundation EOKM, said the platform's tools for scanning for child abuse material have difficulty distinguishing consenting adults from non-consenting children.

She said,

the technology is not good enough right now, which is why human staff are so important.

Before Musk took over, Twitter was already working to curb the spread of CSAM content on its platform. In its latest transparency report, covering July to December 2021, Twitter said it banned more than 500,000 CSAM accounts, a 31% increase over the previous six months.