TikTok treats user safety as paramount on its platform, and to that end it has been testing tools that can identify content violating its policies.
These tools are now being implemented in the US and Canada. All content posted to TikTok initially passes through a tool that flags potential policy violations for review by a safety team member. If a violation is confirmed, the user is notified and the video is removed; if the content is fine, the video is posted.
TikTok is now implementing automated removal for some content categories, but only those where the automation has the highest degree of accuracy. These include potential violations of policies on minor safety, adult nudity, and violent content. Creators will still be able to appeal decisions directly in the app.
This new tool is designed to keep platform users safe. It is also intended to reduce the volume of distressing videos that human moderators have to review, freeing them to spend more time on more nuanced policy violations such as misinformation and bullying.