“Our goal at TikTok is to foster a safe and supportive environment for our community, and there’s no place for this kind of malicious behaviour or content on our platform. We’re proud to partner with StopNCII.org to strengthen efforts to stop the spread of non-consensual intimate imagery and better support victims.” — Julie de Bailliencourt, Head of Product Policy, TikTok.

The partnership lets TikTok show it is taking steps against online harm at a moment of increasing regulatory scrutiny. The UK plans to introduce new rules under the Online Safety Bill that would require platforms hosting user-generated content to prioritize the removal of revenge porn or face stiff penalties, and Australia has implemented similar legislation requiring social media apps to take down reported cases within 24 hours.

“We now have four platforms, but we need thousands,” said SWGfL’s chief executive officer, David Wright. “The more we can get ingesting the hashes, the more we can reduce the threat and fear victims experience.”

This mounting liability for social media platforms is part of what led Meta and SWGfL to build StopNCII.org, a tool to combat the spread of revenge porn.
How does StopNCII.org work?
Motivated harassers may try to upload someone’s intimate photos to several platforms at once to humiliate, manipulate, or extort them. To combat this, Meta developed technology that converts an image into a digital fingerprint called a hash. Building on that technology, StopNCII.org lets people threatened with intimate image abuse generate unique identifiers for their images. When a case is opened, only the hash, a string of letters and numbers, is shared rather than the image itself, and the hashing takes place entirely on the user’s device to protect their privacy. If an image or video uploaded to TikTok, Bumble, Facebook, or Instagram matches a submitted hash and “meets partner policy requirements,” the platform’s moderation team reviews it and removes the content if it breaks the platform’s rules.
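StopNCII.org does not publish a client SDK, and its production systems reportedly use Meta’s open-source PDQ perceptual-hashing algorithm rather than anything this simple. Purely to illustrate the idea, here is a minimal Python sketch using a generic average hash; the function names, the Pillow dependency, and the matching threshold are assumptions for demonstration, not StopNCII’s actual implementation.

```python
from PIL import Image  # pip install Pillow


def average_hash(path: str, hash_size: int = 8) -> str:
    """Reduce an image to a 64-bit perceptual fingerprint (hex string).

    Illustrative only: StopNCII.org's pipeline reportedly uses Meta's
    PDQ algorithm, not this simple average hash.
    """
    # Shrink and grayscale the image: this throws away detail but keeps
    # the overall structure, so minor edits (resizing, light
    # recompression) produce the same or a nearby hash.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: 1 if brighter than the mean, else 0.
    bits = "".join("1" if p > mean else "0" for p in pixels)
    # The hex string below is what gets shared -- the image itself
    # never leaves the user's device.
    return f"{int(bits, 2):0{hash_size * hash_size // 4}x}"


def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two hashes; a platform would flag
    uploads whose hash falls within a small threshold of a case hash."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


# Hypothetical matching step on the platform side: if
# hamming_distance(upload_hash, case_hash) <= 10, queue the upload for
# human review under the partner's policy requirements.
```

The design point the sketch illustrates is that the hash is one-way: a platform holding only the fingerprint cannot reconstruct the photo, which is why the hashing can safely run on the user’s own device.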