Facebook has announced a new process to help stop the spread of offensive content across its platforms. While the main target of the initiative is to stamp out revenge porn, it could have wider implications for other content violations as well.
As noted in their announcement, and by Facebook CEO Mark Zuckerberg, revenge porn is their key focus here.
The new process will work like this:
- When a user sees an "intimate image on Facebook that looks like it was shared without permission", they can report it by using the "Report" link that appears when you tap on the downward arrow next to a post.
- Representatives from Facebook's Community Operations team will review the image and remove it if it violates the platform's Community Standards. Facebook says that in most cases, they'll also disable the account for sharing intimate images without permission.
- Facebook will then use photo-matching technology to stop the offending content from being shared any further across Facebook, Messenger and Instagram. "If someone tries to share the image after it's been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it."
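Facebook hasn't published the details of its photo-matching technology, but systems like this are commonly built on perceptual hashing: a reported image is reduced to a compact fingerprint, and new uploads whose fingerprints are very close are flagged as duplicates even if the file has been recompressed or slightly altered. The sketch below is a minimal, hypothetical illustration of that idea using a simple average hash over grayscale pixel grids, not Facebook's actual implementation.

```python
def average_hash(pixels):
    """Perceptual hash: each bit records whether a pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def is_duplicate(img_a, img_b, threshold=2):
    """Flag near-duplicates: hashes within a small Hamming distance.
    The threshold value here is illustrative, not Facebook's."""
    return hamming_distance(average_hash(img_a), average_hash(img_b)) <= threshold

# Toy 2x2 grayscale "images": a reported image, a slightly altered
# re-upload (one pixel changed), and an unrelated image.
reported = [[10, 200], [30, 220]]
reupload = [[12, 200], [30, 220]]
unrelated = [[200, 10], [220, 5]]

print(is_duplicate(reported, reupload))   # True - caught despite the edit
print(is_duplicate(reported, unrelated))  # False - different content
```

Real systems hash at a larger, fixed resolution and tune the distance threshold to balance catching altered re-uploads against false matches, but the principle is the same: matching fingerprints, not raw files.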
It's the latest advance in Facebook's automated image-recognition efforts, which Zuckerberg highlighted in his recent outline of the company's key community initiatives, published back in February. At that time, Zuckerberg noted that their artificial intelligence and image-recognition tools already generate around a third of all reports to the content review team.
Facebook has been developing its image recognition technology for years - you can see for yourself how far the systems have evolved by checking out the tags included on every image post (in the top right of the image below).
This new application shows that Facebook is now able to put more trust in the system. And while revenge porn is the key focus - an important one, given that one in 25 people in the U.S. has been the victim of non-consensual image sharing - the same approach could be extended to eliminate any reported offensive content from their platforms. If the system can accurately and reliably identify duplicate instances of any image, the potential use cases are significant.
The next phase of this will be video recognition, which Facebook is also developing.
Such advances have many potential use cases, but community safety is a key area of development - one which could lead to a much more secure, inclusive space, free of illegal and offensive material.