Facebook’s new tool uses machine learning to detect revenge porn

Credit: https://newsroom.fb.com/news/2019/03/detecting-non-consensual-intimate-images/

Facebook has announced a new tool that proactively detects revenge porn. According to the company, the tool can detect and flag private images and videos of someone posted online without their consent. It works on both Facebook and Instagram and is capable of detecting “near-nude” content.

Previously, users on both platforms had to report such revenge porn themselves; the new system is designed to support victims by flagging these images and videos when they are posted online.

“By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” Antigone Davis, Global Head of Safety at Facebook, said in a blog post.

Facebook’s blog post did not reveal much about how the company intends to use artificial intelligence to detect revenge porn. However, The Associated Press reports that the new machine learning system goes beyond analyzing images alone. According to the AP, the tool also examines captions; when they contain “derogatory or shaming text,” the images or videos were probably posted to embarrass the people depicted.
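Since Facebook has not published implementation details, the following is only a hypothetical sketch of the idea the AP report describes: combine an image classifier's confidence that content is near-nude with a check of the caption for shaming language. The function names, keyword list, and threshold are all invented for illustration; a real system would use trained models for both signals.

```python
import re

# Hypothetical illustration only -- not Facebook's actual system.
# A crude keyword set stands in for a trained text classifier that
# would detect "derogatory or shaming text" in a caption.
SHAMING_TERMS = {"slut", "exposed", "revenge", "humiliate", "leaked"}

def caption_is_shaming(caption: str) -> bool:
    """Return True if the caption contains any shaming keyword."""
    words = set(re.findall(r"[a-z']+", caption.lower()))
    return bool(words & SHAMING_TERMS)

def should_flag(near_nude_score: float, caption: str,
                image_threshold: float = 0.8) -> bool:
    """Flag a post for human review when the image classifier is
    confident the content is near-nude AND the caption suggests
    intent to shame the person depicted."""
    return near_nude_score >= image_threshold and caption_is_shaming(caption)
```

Combining the two signals is what distinguishes non-consensual sharing from ordinary content: a high nudity score alone, or a hostile caption alone, would not trigger a flag in this sketch, which reduces false positives compared with acting on either signal by itself.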

Whether this will be enough to curb or eliminate revenge porn remains to be seen. Still, the company deserves a pat on the back for its efforts to deal with the issue and protect the private images and videos of users on both platforms.

In 2017, Facebook teamed up with a small Australian government agency, eSafety, to end the non-consensual sharing of nude pictures on its platform and its subsidiary services. Through the collaboration with the eSafety office, victims of revenge porn could preemptively submit an image so that it would be stored in a way that makes it difficult for anyone to upload it to Facebook, Instagram, or Messenger.

“Users will be able to take action to stop their nude pictures from being shared by anyone before the act is committed,” said eSafety Commissioner Julie Inman Grant.

So, even if a hacker or an ex decides to share your intimate pictures for one reason or another, the photos will never show up on Facebook or any of its subsidiary platforms like Messenger and Instagram.

Author: Ola Ric

Ola Ric is a professional tech writer. He has written numerous published articles for professionals and private individuals. He is also a social commentator and analyst, with relevant experience in the use of social media services.
