Facebook is piloting a scheme in which it solicits intimate photographs of women, and it will soon offer the service in the United Kingdom as well as the US. Anxious exes who fear a former partner is set on revenge porn will be urged to upload nude photographs of themselves.
A hash of the nude image is created and passed to Facebook's AI image-matching systems; subsequent unauthorised attempts to post the image on any of Facebook's services (including Instagram) are then blocked.
The story originated with The Australian Financial Review. Australia's safety commish Julie Inman Grant defended the experiment, arguing "they're not storing the image".
Facebook introduced a reporting option for victims of revenge porn in April that worked across Facebook, Messenger and Instagram. The back-end infrastructure created a hash and prevented repeat publications. This is an extension of that service.
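The hash-and-block mechanism described above can be sketched roughly as follows. Facebook has not published its matching algorithm, and in practice it almost certainly uses a perceptual photo-matching hash (PhotoDNA-style) so that resized or re-encoded copies still match; the plain cryptographic hash below is an illustrative stand-in that only catches byte-identical files, and the function names are invented for this sketch.

```python
import hashlib

# Digests of reported images; the image itself need not be retained.
blocked_hashes = set()

def register_image(image_bytes: bytes) -> str:
    """Hash a reported image and store only the digest."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    blocked_hashes.add(digest)
    return digest

def is_blocked(upload_bytes: bytes) -> bool:
    """Check an incoming upload against the stored digests."""
    return hashlib.sha256(upload_bytes).hexdigest() in blocked_hashes

register_image(b"reported-image-data")
print(is_blocked(b"reported-image-data"))   # True: exact copy is caught
print(is_blocked(b"different-image-data"))  # False: any other file passes
```

Note the trade-off the tweet below touches on: a scheme like this could in principle keep only the hash, but matching altered copies, and verifying that a submission really is an intimate image, is why humans and stored copies enter the picture.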
But what an extension.
What's astonishing is that Facebook is soliciting "intimate images" (in its own words) before a crime has occurred. Inman Grant sees no problem with this. "This lets the victim take control and be proactive in their own safety," she told AFR.
But you'd need to trust Facebook a great deal to think that's part of a solution. Perhaps Zuck has been reading The Circle, in which anyone who doesn't trust the Zuck-like figure is treated as a menace to society.
Clarifying Facebook's revenge porn pilot after speaking to them:
- A Facebook worker will see the full, uncensored nude images
- Images stored for a period
- But how else could you do it; it's about risk https://t.co/t3FaaxxR2T
— Joseph Cox (@josephfcox) November 8, 2017
The nude upload option will come to the UK, Canada and America, the Washington Post reports. ®