There has been a sustained push to make Instagram and other social media platforms safer for all users. Instagram in particular has been updating its platform with new tools to help users stay safe, and it is now testing a feature that lets anyone report misleading information directly within the app.
The new reporting tool, according to Matt Navarra, is currently being tested and would help the Facebook-owned app fight misinformation, which is fast becoming a menace on the platform.
In the example screenshot shared by Navarra, the photo-sharing app is adding a new option that allows you to report an Instagram post. To get started, tap the three dots at the top right of the post you want to report and select “Report.” There are two options to choose from: “It’s Spam” or “It’s Inappropriate.” Selecting “It’s Inappropriate” takes you to a separate reporting flow, where “False Information” will now appear as an option.
— Matt Navarra (@MattNavarra) August 15, 2019
These options are currently accessible only to a select group of users, since the feature is still in testing. If you cannot see the options described above, you are probably not part of the current test.
“From the end of August, people will be able to let us know about content that people believe may be misinformation and help improve our ability to proactively catch misinformation. When we find misinformation on Instagram, we filter it from places where people discover new content – Explore and hashtag pages,” reads a snapshot of Instagram’s statement captured by Navarra.
As part of its efforts to keep the platform safe and help users abide by its policies, the social media behemoth announced last month that it would start alerting users when their accounts are close to being disabled. The new notification process, according to Instagram, will help you understand whether your account is at risk of being disabled.
“We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time. Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram,” Instagram said.
In addition, the notification will give you the opportunity to appeal deleted content. Initially, you will be able to appeal content removed under Instagram’s policies on nudity, pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism. The Facebook subsidiary said it will expand appeals to cover more topics in the coming months.