MENLO PARK (CBS SF) — Facebook unveiled new technology Friday that uses artificial intelligence to detect and remove revenge porn — intimate pictures uploaded to social media without the victim’s knowledge.

In a release, Antigone Davis, Global Head of Safety for Facebook, said the social media giant has always responded to requests to remove intimate images but, with the new software, can detect them before a complaint is filed.

“Finding these images goes beyond detecting nudity on our platforms,” Davis wrote. “By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it.”
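Facebook has not published details of its model or review pipeline, but the proactive flow Davis describes — a classifier flags likely violating content, which then goes to a human reviewer before anyone reports it — can be sketched roughly as follows. The classifier score, threshold, and queue names here are all hypothetical illustrations, not Facebook's actual system.

```python
# Illustrative sketch only: the classifier score, threshold,
# and routing labels are hypothetical, not Facebook's system.
from dataclasses import dataclass


@dataclass
class Upload:
    image_id: str
    violation_score: float  # hypothetical model confidence, 0.0 to 1.0


REVIEW_THRESHOLD = 0.8  # assumed cutoff for routing to human review


def triage(upload: Upload) -> str:
    """Route an upload per the proactive flow described above:
    flagged content goes to a human reviewer before any user report."""
    if upload.violation_score >= REVIEW_THRESHOLD:
        return "human_review"  # specially trained reviewer decides
    return "published"         # no proactive action taken


print(triage(Upload("img-1", 0.93)))  # flagged for review
print(triage(Upload("img-2", 0.10)))  # passes through
```

The key design point in Davis's description is that the machine-learning step only surfaces candidates; the removal decision itself stays with a trained human reviewer.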

She said the software upgrade was important because “often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared.”

Once detected, the photos will be reviewed by a specially trained member of Facebook’s Community Operations team.

“If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission,” Davis wrote. “We offer an appeals process if someone believes we’ve made a mistake.”

Facebook is also launching a revenge porn victim-support hub called “Not Without My Consent.”

“Victims can find organizations and resources to support them, including steps they can take to remove the content from our platform and prevent it from being shared further,” Davis wrote.

Many states have also taken steps to prevent revenge porn social media posts. At least 42 states have passed laws criminalizing revenge porn, many in the past five years.
