The growing scourge of “revenge porn,” which involves the distribution of sexually explicit material without the consent of all parties concerned, has prompted a number of countries to legislate against such activities. Thirty-five U.S. states now have revenge porn laws, as do the U.K., Germany, and Israel.
While laws play an important part in curbing revenge porn attacks, preventing the distribution of such material is obviously the first priority, which is why technology companies have been increasingly working on new tools to help put a stop to the practice. Google has rules and processes in place to remove revenge porn imagery, as do Microsoft and blogging platform Medium, among others.
Facebook is known to have been working on specific revenge porn tools for a while, and the company updated its community standards guidelines back in 2012 to address revenge porn specifically. But today Facebook has offered its first glimpse of new tools to combat the sharing of explicit personal imagery, features that were developed “in partnership with safety experts.”
Moving forward, if you see an “intimate” photo shared on Facebook, Messenger, or Instagram that appears to have been uploaded without permission, you can report it via the Report option in the downward-arrow menu next to the post.
“Specially trained representatives from our Community Operations team review the image and remove it if it violates our Community Standards,” explained Antigone Davis, Facebook’s head of global safety, in a blog post announcing the new tools. “In most cases, we will also disable the account for sharing intimate images without permission. We offer an appeals process if someone believes an image was taken down in error.”
Facebook has courted controversy in the past over the way its algorithms decide what content is pushed to the masses, so it’s heartening to know that in revenge porn cases the company will be using humans to assess the situation. That said, it has also run into trouble when relying on human editors, perhaps most notably when it removed the Pulitzer Prize-winning “napalm girl” photograph and banned the writer who shared it.
Facebook says it won’t be relying exclusively on humans when revenge porn reports are filed. Indeed, once it has been decided that a photo shouldn’t be shared across the company’s various platforms, Facebook will use photo-matching smarts to prevent the image from being reposted. “If someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” added Davis.
Additionally, Facebook says that it’s teaming up with a number of safety organizations that will offer extra resources and support to revenge porn victims.
Today’s launch comes just a few months after Facebook lost a bid to prevent a lawsuit from a 14-year-old girl whose nude photos appeared on the social network.