The world’s leading short-form video platform, TikTok, has released its Community Guidelines Enforcement Report, which details the volume and nature of violative content and accounts removed from the platform in Q3 2021. The report provides insight into content removed for violating the Community Guidelines, reinforcing the platform’s public accountability to the community, policymakers, and NGOs.

To protect the safety of the community and the integrity of the platform, 91,445,802 videos were removed globally between 1 July and 30 September 2021, comprising around 1 percent of all videos uploaded. Nearly 95 percent of those videos were removed before a user reported them, 88 percent before they received any views, and 93 percent within 24 hours of being posted. With 6,019,754 videos removed, Pakistan ranked fourth in the world for the largest volume of videos taken down for Community Guidelines violations in Q3 2021. In addition, 73.9 percent of content promoting harassment and bullying was removed proactively, while 72.4 percent of hateful-behavior videos were removed before anyone reported them.

TikTok has also announced updates to its Community Guidelines to further support the well-being of its community and the integrity of the platform. These updates clarify or expand upon the types of behavior and content that will be removed from the platform or made ineligible for recommendation in the ‘For You’ feed. Over the coming weeks, every TikTok member will be prompted to read the updated guidelines when they open the application. To protect the security, availability, and reliability of the platform, TikTok is expanding its policy to prohibit unauthorized access to the platform, as well as to TikTok content, accounts, systems, or data. Using TikTok to perpetrate criminal activity is likewise prohibited.
In addition to educating the community on ways to spot, avoid, and report suspicious activity, the platform is opening state-of-the-art cyber incident monitoring and investigative response centers in Washington, DC, Dublin, and Singapore this year.

TikTok continues to expand the system that detects and removes certain categories of violations at upload, including adult nudity and sexual activities, child safety, and illegal activities and regulated goods. As a result, the volume of automated removals has increased, improving the overall safety of the platform and allowing the team to focus on reviewing more contextual or nuanced content, such as hate speech, bullying, harassment, and misinformation, with greater efficacy, speed, and consistency. This improvement stems from combining technology with human content moderation, supported by a dedicated investigations team deployed to identify videos that violate policies. To better enforce these policies, moderators also receive regular training that enables them to identify content featuring reappropriation, slurs, and bullying.

TikTok’s Community Guidelines apply to everyone and to all content on the platform, with the aim of achieving a safer standard of content appropriate for a general audience, which includes everyone from teens to great-great-grandparents.