The NAKED truth: Why flashing us your nude pics is a good idea – by Facebook's safety boss

We can explain, insists multibillion-dollar social network

Amid days of intense debate over its controversial plan to block revenge porn on its social network, Facebook has sought to calm fears about the program.

Antigone Davis, Facebook's global head of safety, on Thursday attempted to clarify details of the system, which is being tested right now in Australia and is heading to the UK, America, and Canada. In doing so she made some good points regarding the approach taken, but also revealed some of its limitations.

Essentially, the system works like this: if you suspect someone has copies of photos of you naked and is going to leak them on Facebook, you can preemptively upload those snaps to a private chat area of the network, where a trained staffer will verify the photo, then generate and store a digital signature of the image. Once that happens, the user should delete the image from the message thread, removing it from the network.

If any photo subsequently posted on Facebook matches one of these signatures, it will be automatically blocked. Thus, if someone shares one of your previously submitted nude pics, the action will be halted before any damage is done.
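Facebook hasn't published the nuts and bolts, but the flow described above boils down to something like this minimal Python sketch – a plain SHA-256 digest stands in for whatever fingerprinting the biz actually uses, and the function names are ours:

```python
import hashlib

# Signatures of reviewer-verified submissions. Facebook keeps only these;
# the submitted image itself is deleted once the hash has been generated.
blocked_signatures: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in digital signature: a one-way SHA-256 digest of the image."""
    return hashlib.sha256(image_bytes).hexdigest()

def submit_preemptively(image_bytes: bytes, reviewer_approved: bool) -> None:
    """Victim uploads the snap, a trained staffer verifies it, and only the
    signature is stored before the image is removed from the thread."""
    if reviewer_approved:
        blocked_signatures.add(fingerprint(image_bytes))
    # either way, the image bytes go no further than this function

def allow_post(image_bytes: bytes) -> bool:
    """Every subsequently posted photo is checked against the stored hashes."""
    return fingerprint(image_bytes) not in blocked_signatures
```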


The one obvious drawback is that someone at Facebook will get to see you, albeit briefly, naked. And for obvious reasons, it's for adults only, not children. The internet went bananas at the news. So, Davis grabbed a keyboard, took a deep breath, and...

"With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place," she said. "This program is completely voluntary. It's a protective measure that can help prevent a much worse scenario where an image is shared more widely."

Davis explained the system applies across the tech giant's entire platform: Facebook, WhatsApp, and Instagram. The digital signature kept on file by Facebook is basically a cryptographic hash – a one-way fingerprint of the image that can't be reversed to recover the original picture.
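To illustrate the one-way part: change a single byte of the picture and the digest changes completely, and there is no way back from digest to image. A quick demo, again using SHA-256 purely as a stand-in:

```python
import hashlib

original = b"...pretend this is a few megabytes of JPEG data..."
tweaked = original.replace(b"pretend", b"Pretend")  # a one-byte change

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
# The two digests have nothing in common, and neither can be reversed to
# recover the photo - which is why storing the signature is far less
# sensitive than storing the picture itself.
```

That same property is also the scheme's weak spot, as we'll get to below: re-save, crop, or scribble on the image and an exact hash no longer matches.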

The scheme is for people who fear an ex-partner, an abusive lover, or similar may share their private intimate snaps publicly on Facebook – an act often described as revenge porn because these pictures are typically leaked to humiliate and harass victims. According to Davis, preemptively submitted images will be viewed by a "specially trained representative from our community operations team," to check the photos are actually legit nude snaps and not an attempt to censor other images on the platform.


If an ex, or similar scumbag, is threatening to leak private pics or blackmailing a person, the victim has a clear choice: either briefly show the snaps to one person at Facebook, or spend days or months worrying about the whole world seeing them. In such cases, you can understand Facebook's approach, and we're assuming here that when a user deletes the submitted photos they truly are deleted forever, and that the reviewer is honest and doesn't keep a copy or otherwise exploit their rather powerful position.

However, there are a couple of little things potentially ruining the party.

First off, chances are that the victim doesn't have a copy of the nude photograph that someone else has taken. If it's not a selfie, such as someone taking a photo of themselves in the mirror, then it's likely been snapped by their partner. The victim might also have deleted the image after sharing it with someone who then kept a copy. In such cases, the victim is out of luck.

Even if the victim still has a copy, the hashing system may be easy to circumvent. If the image has been altered enough – such as rotated, cropped, flipped, or had some awful words scribbled across it – shared revenge porn may bypass the filters, and the victim will have submitted their pictures in vain. It is possible to generate a robust set of signatures for each image: each picture could be reduced to a basic low-color, low-resolution form with the center area of the frame hashed, to potentially defeat a miscreant's attempts to evade the filters by cropping, defacing, or color-washing the snap. A set of signatures could be generated for each combination of flipping the photo along its X and Y axes, and for every degree of rotation, all 360 of them, or every five degrees, or whatever.
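To be clear, Facebook hasn't said it does any of this – but the sort of signature set described above is easy enough to sketch with Pillow: normalise each picture to a small, coarse, greyscale form, hash the centre of the frame, and repeat for flips and a sweep of rotations. The helper names and thresholds here are ours:

```python
import hashlib
from PIL import Image, ImageOps

def normalise(img: Image.Image) -> bytes:
    """Reduce to a low-colour, low-resolution form so minor edits wash out."""
    small = img.convert("L").resize((32, 32))      # greyscale, 32x32
    coarse = small.point(lambda p: p // 32 * 32)   # crush to eight grey levels
    return coarse.tobytes()

def centre_crop(img: Image.Image, keep: float = 0.6) -> Image.Image:
    """Hash only the middle of the frame, to shrug off cropped-away borders."""
    w, h = img.size
    dx, dy = int(w * (1 - keep) / 2), int(h * (1 - keep) / 2)
    return img.crop((dx, dy, w - dx, h - dy))

def signature_set(img: Image.Image, step_degrees: int = 5) -> set[str]:
    """One hash per flip and rotation of the normalised centre crop."""
    flips = (img, ImageOps.mirror(img), ImageOps.flip(img))
    return {
        hashlib.sha256(normalise(centre_crop(f.rotate(angle)))).hexdigest()
        for f in flips
        for angle in range(0, 360, step_degrees)
    }
```

At a five-degree step that's 216 signatures per submitted photo – cheap enough to compute once, up front, on the victim's side.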

This would create a collection of digital fingerprints per photo, against which subsequently uploaded snaps would need to be checked. At least a quarter of a million pictures are posted on Facebook every minute, on average, so that's a lot of signature checks. We hope it scales. We hope Facebook is using robust signatures, and not flimsy hashes that are useless against basic image-editing tools.
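The checking side can at least be made cheap per upload: if the flips-and-rotations work is done once at submission time, each incoming photo needs only one normalisation and one set lookup. Continuing the sketch, with the same made-up normalisation as above:

```python
import hashlib
from PIL import Image

def robust_signature(img: Image.Image) -> str:
    """Same normalisation as above: centre crop, greyscale, 32x32,
    coarse grey levels, then a SHA-256 of the result."""
    w, h = img.size
    cropped = img.crop((w // 5, h // 5, w - w // 5, h - h // 5))
    coarse = cropped.convert("L").resize((32, 32)).point(lambda p: p // 32 * 32)
    return hashlib.sha256(coarse.tobytes()).hexdigest()

def is_blocked(upload: Image.Image, stored_signatures: set[str]) -> bool:
    """One signature and one set lookup per posted photo: the heavy lifting
    was done once, on submission, when the signature set was generated."""
    return robust_signature(upload) in stored_signatures
```

A real deployment would presumably use a proper perceptual hash rather than our crude downscale-and-digest, but the shape of the lookup is the same.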

And since Facebook's face-recognition code is rather fantastic, in addition to hashing, the biz could apply this machine-learning tech to identify any photos of those who have preemptively submitted nude snaps, and apply greater scrutiny to any posted pics matching their faces to ensure the images are not of the revenge-porn variety. It could also bring in PhotoDNA, the Microsoft-developed system Facebook already uses to detect child sex abuse material on the network, obviously refocused on revenge porn pics.
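That is our suggestion rather than anything Facebook has announced, but the shape of it is simple enough: compare a face embedding of each newly posted photo against the embeddings of people who have enrolled, and route near-matches for extra scrutiny. In this sketch, embed_face is a hypothetical stand-in for whatever face-recognition model you have to hand:

```python
import numpy as np

def embed_face(image_bytes: bytes) -> np.ndarray:
    """Hypothetical stand-in for a face-recognition model that maps a photo
    to a fixed-length embedding vector."""
    raise NotImplementedError("plug in a real model here")

def needs_extra_scrutiny(upload: bytes,
                         enrolled_embeddings: list[np.ndarray],
                         threshold: float = 0.8) -> bool:
    """Flag a posted photo for closer review if its face embedding sits close,
    by cosine similarity, to someone who has preemptively enrolled."""
    vec = embed_face(upload)
    for person in enrolled_embeddings:
        similarity = float(np.dot(vec, person) /
                           (np.linalg.norm(vec) * np.linalg.norm(person)))
        if similarity >= threshold:
            return True
    return False
```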

Facebook's approach, of course, won’t stop people posting the pictures to image-hosting sites, like Imgur, and then posting a link on Facebook.

To give its approach a bit of welly, Facebook rolled out quotes from people it has been working with on the pilot Down Under. Not surprisingly, they are all very supportive of Zuck & co's design.

"If you've never tried to end a relationship with an abusive, controlling, and violent partner, there is no way you'd understand the very real terror victims feel of how much damage an abuser can and will do by sharing intimate images," said Cindy Southworth, founder of the Safety Net Technology Project. "This voluntary option provides another tool to victims to prevent harm."

Facebook's chief security officer Alex Stamos has also taken to Twitter to support the system. While he has been taking some flak for it, he has also answered the key question of why Facebook doesn't simply let people do the hashing themselves in a device-side app.

While Facebook's scheme is in some ways flawed, it may help a few people out. Frankly, given the damage revenge porn can do to people's lives, that's no bad thing, but the whole scheme seems overly optimistic.

Facebook is doing its best, but this kind of thing is a shit sandwich all around. The network is damned if it tries and damned if it doesn't do anything. One solution is to stop taking naked photos of each other, or sharing intimate snaps. The best solution would be if abusive scumbags could stop being so awful. ®
