The popular app TikTok removed 81,518,334 videos from April to June for violating community guidelines

We’ve all had it happen, seemingly for no reason: we post a video on TikTok and then, mysteriously, it is taken down for ‘violating community guidelines’.

What is happening? Why is TikTok removing our videos?

TikTok today released a statement on these removals, announcing it had taken down a staggering [deep breath] 81,518,334 videos from the platform between April and June alone, which is still less than 1 percent of all content uploaded in that period.

Interestingly, 94.1 percent were removed BEFORE being reported by a user, so most likely, you have no haters to blame but yourself.

In the new report, TikTok detailed exactly which community guidelines the removed content violated. Why is it so stringent? To “reinforce the platform’s public accountability to the community, policymakers, and NGOs,” says TikTok.

So what kind of content is being removed?

Out of the 81 million removals, 73.3 percent of videos promoting harassment and bullying and 72.9 percent of videos featuring hateful behavior were taken down before being reported, so harassment remains the biggest offender.

TikTok is getting more stringent about removing this content: in the first quarter of this year, those same figures stood at 66.2 percent and 67 percent respectively.

“The improvement stems from the pioneering combination of technology and content moderation by a dedicated investigations team used to identify videos that violate policies. To better enforce these policies, moderators also receive regular training to identify content featuring reappropriation, slurs and bullying,” said a statement from TikTok.

Quite simply, TikTok needs to exist on the global stage, so any hint of a violation, correctly flagged or not, will result in immediate removal.

If you think the removal was a mistake, you can of course appeal it, or simply delete and repost the video.

Of the videos removed, TikTok added, 93 percent were taken down within 24 hours of posting and 87.5 percent had zero views.

The platform also announced improved mute settings for comments and questions during live streams: hosts can temporarily mute select viewers for anywhere from a few seconds to the entire duration of the LIVE, and once a user is muted, their entire comment history is also removed. This comes in addition to the existing options to turn off comments or limit potentially harmful comments using a keyword filter.

 “With our recent milestone of one billion monthly users, it is evident that people come to TikTok to express themselves creatively, connect, discover new sources of entertainment and explore interests. In order to promote a safe and welcoming environment, we continuously strive to uphold our Community Guidelines, make improvements to our detection mechanisms and champion positive, inspiring content. Our efforts to get these critical issues right for our community are ongoing, and we aim for these new controls to further empower hosts and audiences alike to enjoy a safe and entertaining experience on TikTok,” said Hany Kamel, Content Operations Director at TikTok MENA.

TikTok also allows users to customize their experience with a range of tools and resources, including effective ways to filter comments on their content, delete or report multiple comments at once, and block accounts in bulk. To curb unkind or violative comments before they appear, prompts have been introduced that urge users to consider the impact of their words before posting.

 “This has already proven effective with nearly four in 10 people withdrawing or editing their comments,” said TikTok.

So if you’re wondering why your videos keep being deleted, perhaps read TikTok’s Community Guidelines a bit more closely.
