TikTok moderators say they were shown videos of child sexual abuse during training

A Forbes report raises questions about how TikTok’s moderation team handles child sexual abuse material, alleging that it provided widespread, unprotected access to illegal photos and videos.

Employees of a third-party moderation firm called Teleperformance, which works with TikTok among other companies, claim it asked them to review a disturbing spreadsheet dubbed the DRR, or Daily Required Reading, on TikTok moderation standards. The spreadsheet allegedly contained content that violated TikTok’s guidelines, including “hundreds of images” of children who were nude or being abused. The employees say hundreds of people at TikTok and Teleperformance could access the content from both inside and outside the office, opening the door to a broader leak.

Teleperformance denied to Forbes that it showed employees sexually exploitative material, and TikTok said its training materials have “strict access controls and do not include visual examples of CSAM,” although it did not confirm that all third-party vendors meet that standard.

The employees tell a different story, and as Forbes lays out, it’s a legally dicey one. Content moderators are routinely forced to deal with CSAM posted on many social media platforms. But child abuse imagery is unlawful in the US and must be handled carefully. Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go far beyond that limit. They indicate that Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee says they contacted the FBI to ask whether the practice constituted criminally spreading CSAM, though it’s unclear whether an investigation was opened.

The full Forbes report is well worth a read, outlining a situation where moderators were unable to keep up with TikTok’s explosive growth and were told to watch crimes against children for reasons they felt didn’t add up. Even by the complicated standards of the online debate about child safety, this is a strange and, if accurate, horrifying situation.
