[Infowarrior] - If You Want to Use Facebook’s Revenge Porn Blocker, An Employee Will Have to Review Your Uncensored Photo

Richard Forno rforno at infowarrior.org
Wed Nov 8 19:11:14 CST 2017


No, this is not from The Onion.    ---rick


If You Want to Use Facebook’s Revenge Porn Blocker, An Employee Will Have to Review Your Uncensored Photo
https://gizmodo.com/if-you-want-to-use-facebook-s-revenge-porn-blocker-an-1820271537

Facebook announced on Tuesday that it is deploying a new revenge porn reporting tool, first piloted in Australia, that will allow users to send photos that they don’t want shared online directly to Facebook. Facebook says its community operations team will use a hash system to prevent the photo from being shared across Facebook, Instagram, or Messenger. But before an image is hashed, it will be looked at by someone at Facebook.
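For context, the kind of hash matching described above is generally understood to work like perceptual image hashing: the service keeps only a compact fingerprint of a reported photo and compares it against fingerprints of new uploads. Facebook has not published the details of its system, so the short Python sketch below is purely illustrative; the function names, the "average hash" technique, and the match threshold are assumptions, not Facebook's implementation.

    # Illustrative sketch only -- Facebook's actual hashing method is not public.
    # Shows the general idea of a perceptual "average hash": the service stores
    # a small fingerprint of the reported photo, not the photo itself, and
    # compares every new upload's fingerprint against it.
    from PIL import Image  # assumes Pillow is installed

    def average_hash(path, hash_size=8):
        # Shrink to a tiny grayscale grid, then mark each pixel as 1 if it is
        # brighter than the average, else 0. Similar images give similar bits.
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = "".join("1" if p > mean else "0" for p in pixels)
        return int(bits, 2)

    def hamming_distance(h1, h2):
        # Count differing bits; a small distance suggests a near-duplicate.
        return bin(h1 ^ h2).count("1")

    # Hypothetical usage: hash the reported photo once, discard the image,
    # and flag uploads whose hash falls within a few bits of the stored value.
    # if hamming_distance(average_hash("reported.jpg"),
    #                     average_hash("upload.jpg")) <= 5:
    #     print("possible match with a blocked image")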

A Facebook spokesperson confirmed to the Daily Beast today that a staffer will first have to look at the uncensored version of the image in order to make sure that the uploaded content fits the definition of revenge porn. What’s more, images will be blurred and stored by Facebook and “available to a small number of people,” according to the Daily Beast.

“The photo has to be examined by a human first to make sure it is actually objectionable per policy,” security researcher Nicholas Weaver told Gizmodo in an email. “Otherwise, someone could upload the famous ‘tank man’ photo, call it revenge porn, and censor it that way.” When asked how a photo will look to a Facebook employee once a user uploads it, Weaver said that in order to determine whether an image is objectionable, “it has to be clearly visible.” Only after that determination is made will the image be blurred and rendered unidentifiable.

While giving users the power to get ahead of abusers by preemptively uploading any media they don’t want shared online isn’t inherently bad, requiring a stranger to look at the uncensored content leaves a lot of room for improvement. For one, Facebook’s system still requires you to place a great deal of trust in an unknown Facebook employee or contractor, and to accept that they will be looking at photos you otherwise wouldn’t want anyone to see.

This policy puts the responsibility on users to prevent anticipated abuse, and it also signals that Facebook doesn’t totally trust its beloved algorithms to do the job. The company has praised its use of machine learning as a way to deal with harassment, fake news, and more. But it still seemingly doesn’t have a sophisticated enough algorithm to determine whether a photo or video can be considered revenge porn.

We have reached out to Facebook for comment.


