“I was moderating and thinking: This is someone’s son. And these parents don’t know that we have this picture, this video, this trauma, this crime saved,” Whitney told Forbes. “If parents knew that, I’m pretty sure they would burn TikTok down.”

Teleperformance’s Global President of Trust & Safety Akash Pugalia told Forbes the company does not use videos featuring explicit content of child abuse in training, and said it does not store such material in its “calibration tools,” but would not clarify what those tools are or what they do. He declined to answer a detailed list of other questions regarding how many people have access to child sexual abuse material through the DRR and how Teleperformance safeguards this imagery.

TikTok is hardly alone in its struggle to purge child sexual abuse material. The most powerful social media platforms on the planet have long used machine learning and third-party human reviewers to catch and remove such content before it’s widely shared, and many companies work with the National Center for Missing & Exploited Children, or NCMEC, to alert law enforcement of such problematic imagery in their apps. What is unique, however, is the way TikTok and its outside consultants are handling this material, an approach experts say is ham-handed and cavalier at best, and harmful and re-traumatizing at worst. They say showing sexual images of kids in content moderation training, censored or not, only re-victimizes them. And storing the images in a widely accessible document is reckless and unnecessary.

TikTok spokesperson Jamie Favazza said the company’s “training materials have strict access controls and do not include visual examples of CSAM,” but conceded that it works with third-party firms that may have their own processes. TikTok declined to answer questions about how many employees have access to the DRR and where the document is hosted. TikTok’s parent company ByteDance, purveyor of the Lark platform, did not respond to repeated requests for comment.

“Free and loose is not going to work here”

Images of child abuse and exploitation are illegal, and there are strict rules for handling them when discovered. The material must immediately be taken down and reported to NCMEC’s CyberTipline, where staff analyze the files and work to track down where they came from so they can alert the appropriate law enforcement agency. Once a company has reported this material to NCMEC, it is statutorily granted immunity to retain it for 90 days to aid authorities. But federal law explicitly requires companies to “minimize the number of employees that are provided access” to that content, “maintain the materials in a secure location,” and “ensure that any such visual depiction is permanently destroyed, upon a request from a law enforcement agency.”

Top legal and online safety experts said making abusive and exploitative content featuring minors widely available to Teleperformance and TikTok workers with lax safeguards runs counter to that mandate and could cross the line from a safety or privacy problem to a crime.