TikTok Deletes Over 590,000 Kenyan Videos for Violating Safety Rules

TikTok has deleted more than 592,000 videos from Kenyan users between April and June 2025 for violating its community guidelines, as the platform rolls out new global rules tightening oversight on livestreams, monetisation, and AI-driven content.

According to the company’s Q2 2025 Community Guidelines Enforcement Report, the removals were part of a broader crackdown that saw over 189 million videos pulled down globally—equivalent to 0.7 percent of all uploads on the app.

The platform also removed 76.9 million fake accounts and 25.9 million suspected underage accounts during the same period.

In Kenya, TikTok said 92.9 percent of the flagged videos were removed before they received a single view, while 96.3 percent were taken down within 24 hours of being posted.

The figures mark one of the highest proactive enforcement rates in Africa and reflect growing investment in moderation technology and human review teams across the continent.

The report comes as TikTok introduces tougher livestream and monetisation rules aimed at curbing misuse, misinformation, and harmful behaviour in real-time broadcasts.

Under the new rules, only users aged 18 and above can host a live session or receive virtual gifts, while accounts belonging to users under 16 remain barred from livestreaming altogether.

Additionally, creators must disclose when content is branded or promotional and are forbidden from pressuring viewers for digital gifts or directing traffic to external platforms during a broadcast.

TikTok has also moved to regulate the use of AI-assisted features such as auto-captions and real-time translations. Livestream hosts are now responsible for monitoring these tools and ensuring no misleading or harmful material is generated.

The new livestream policy also impacts content monetisation. By tightening rules around gift solicitation and brand disclosure, TikTok hopes to limit exploitation and improve transparency in its creator economy.

Globally, TikTok said 97.9 percent of all videos removed for policy violations were taken down before any user reported them, underscoring its growing reliance on automated detection. In Africa, Kenya ranked among the top five countries for content moderation activity, alongside Nigeria, South Africa, Egypt, and Ghana.

The company noted that its focus for 2025 remains “building trust through transparency” and announced plans to expand its Transparency and Accountability Centres — facilities where regulators, journalists, and researchers can observe moderation practices firsthand.

The latest enforcement report follows heightened scrutiny from governments worldwide, including Kenya, where policymakers have been debating potential regulation of social media platforms over youth safety and digital taxation.