TikTok to warn users of unverified content before sharing
TikTok is rolling out a new feature that warns users before they share content that fact-checkers could not verify.
Prior to this feature, TikTok had already partnered with fact-checkers at PolitiFact, Lead Stories, and SciVerify to help the platform assess the accuracy of content. If a fact check confirms the content is false, TikTok removes the video. However, fact checks are sometimes inconclusive or cannot be completed, especially during unfolding events. In those cases, TikTok will inform viewers that the video contains unsubstantiated content in an effort to reduce sharing.
Here’s how it works: First, a viewer will see a banner on a video if the content has been reviewed but cannot be conclusively validated. The video’s creator will also be notified that their video was flagged as unsubstantiated content. If a viewer attempts to share the flagged video, they’ll see a prompt reminding them that the video has been flagged as unverified. This additional step creates a pause, prompting people to reconsider before they choose to “cancel” or “share anyway.”
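The flow above can be sketched in pseudocode-style Python. This is purely illustrative; every name here (`Video`, `flag_as_unverified`, `attempt_share`) is hypothetical and not TikTok’s actual implementation.

```python
# Illustrative sketch of the unverified-content share flow; all names
# are hypothetical, not TikTok's real API.
from dataclasses import dataclass


@dataclass
class Video:
    creator: str
    flagged_unverified: bool = False


def flag_as_unverified(video: Video) -> list[str]:
    """Mark a video whose fact check was inconclusive and return the
    notices shown to viewers and the creator."""
    video.flagged_unverified = True
    return [
        "banner shown to viewers: this video contains unverified content",
        f"notice to {video.creator}: your video was flagged as unverified",
    ]


def attempt_share(video: Video, choice: str = "share") -> str:
    """Resolve a share attempt, inserting the warning prompt for
    flagged videos so the viewer must pause and choose."""
    if not video.flagged_unverified:
        return "shared"
    # Flagged video: the prompt offers "cancel" or "share anyway".
    if choice == "share anyway":
        return "shared after warning"
    return "share cancelled"
```

For example, sharing an unflagged video succeeds immediately, while a flagged one is shared only if the viewer explicitly picks “share anyway” at the prompt.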
This feature will be rolling out globally over the coming weeks, starting today in the US and Canada.