TikTok to tackle spread of misinformation with new warning prompts

Image Credit: TikTok

TikTok is adding a new feature to its platform to boost its efforts to curb the spread of misinformation. Going forward, users will be warned before sharing a video containing information that has not been verified. Claims that fact checkers have been unable to confirm will trigger a warning prompt before the video can be shared on the platform.

“Sometimes fact checks are inconclusive or content is not able to be confirmed, especially during unfolding events. In these cases, a video may become ineligible for recommendation into anyone’s For You feed to limit the spread of potentially misleading information,” Gina Hernandez, TikTok’s Product Manager for Trust & Safety, wrote in a blog post.

Here is how it works:

  • First, a viewer will see a banner on a video if the content has been reviewed but cannot be conclusively validated.
  • The video’s creator will also be notified that their video was flagged as unsubstantiated content.
  • If a viewer attempts to share the flagged video, they will see a prompt reminding them that the video has been flagged as unverified content. This additional step forces a pause to consider the next move before choosing to “cancel” or “share anyway.”
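The share flow above can be sketched as follows. This is a hypothetical simplification for illustration only; the names (`Video`, `share`, `flagged_unverified`) are assumptions, not TikTok's actual code or API.

```python
# Hypothetical sketch of the unverified-content share flow described above.
# All names and structure are illustrative assumptions, not TikTok's API.

from dataclasses import dataclass


@dataclass
class Video:
    id: str
    flagged_unverified: bool  # set when fact checks are inconclusive


def share(video: Video, confirm) -> str:
    """Return 'shared' or 'cancelled'.

    `confirm` is a callable that prompts the viewer when the video
    carries an unverified-content banner and returns True for
    "share anyway", False for "cancel".
    """
    if video.flagged_unverified:
        # The extra step: the viewer must explicitly choose "share anyway".
        if not confirm("Caution: this video has been flagged as "
                       "unverified content. Share anyway?"):
            return "cancelled"
    return "shared"


# Example: a flagged video where the viewer decides to cancel.
v = Video(id="123", flagged_unverified=True)
print(share(v, confirm=lambda msg: False))  # -> cancelled
```

The key design point, per TikTok's description, is simply the added friction: an unflagged video shares immediately, while a flagged one requires an explicit second decision.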

TikTok said that when it tested the feature, the rate at which viewers shared flagged videos dropped by 24%, while likes on such unsubstantiated content dropped by 7%. The feature is rolling out first to users in the US and Canada, where it has already started to appear, and is expected to reach other markets in the coming weeks.

Last October, TikTok rolled out a new notification system that offers more clarity around content removals. The new system tells you which of TikTok’s policies you violated and why your content was removed. You are also offered the chance to appeal the decision.

When your video is removed, you will be notified in-app with the date of the post as well as the specific policy that was violated. The company will also provide you with a link to that policy.

For videos related to self-harm and suicide, TikTok will direct you to expert resources, including befrienders.org, through an additional notification.

Recall that the video-sharing app was banned by the Pakistan Telecommunication Authority following several complaints it received from “different segments of the society” alleging that TikTok encourages the sharing of “immoral/indecent content.” The ban left the app inaccessible to millions of people for 10 days before it was lifted during the week.

Author: Ola Ric

Ola Ric is a professional tech writer. He has written numerous published articles for professionals and private individuals. He is also a social commentator and analyst with extensive experience covering social media services.
