Facebook has settled a case with a 14-year-old girl after the social network hosted revealing pictures of her on a Facebook "shame" page.
In perhaps the first case of its kind in the world, Facebook was taken to the High Court by lawyers for a Northern Irish girl, whose nude picture was repeatedly posted to the page between November 2014 and January 2016.
Although changes in the law intended to discourage "revenge porn" have resulted in several hundred CPS prosecutions, this was a civil case brought under the Data Protection Act. The girl had been seeking damages for misuse of private information, negligence and breach of the DPA. Unusually, she had been able to obtain Legal Aid.
Her lawyers argued Facebook could and should have done more. Facebook contested the claim, arguing it had fulfilled its legal obligations by removing the photographs whenever it was asked to do so. Facebook tried to have the case dismissed in September 2016, but failed.
Last week, an out-of-court agreement was reached to compensate the victim - who cannot be identified - and pay her legal costs in a confidential settlement.
"The case had a very detrimental effect on (the victim's) mental health. That is why her family decided to seek legal redress," her lawyer, Pearse MacDermott, told the Press Association. "Whenever an image is put up that is clearly objectionable they should be able to stop that ever going up again. They should use the technology they have to be a responsible provider and remove the offensive post."
Facebook has since said that it has such an ability, and is trialling a scheme in Australia under which people who fear they are at risk from revenge porn can send in photos to be fingerprinted, so that matching uploads are automatically blocked across all Facebook properties. Why similar technology wasn't applied in this case is unknown: the settlement means the question was never tested in court.
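Facebook hasn't published the details of that scheme, but blocking systems of this kind are generally built on perceptual hashing: a compact "fingerprint" of the image that survives small alterations, so a re-upload of a cropped or recompressed copy still matches. The sketch below is a toy illustration under that assumption, not Facebook's actual pipeline (the names `average_hash`, `BlockList` and so on are invented for this example). It computes a 64-bit average hash over an 8x8 grayscale grid and rejects any upload whose hash falls within a small Hamming distance of a reported image.

```python
def average_hash(pixels):
    """Compute a 64-bit perceptual hash from an 8x8 grayscale grid.

    pixels: list of 8 rows, each a list of 8 ints in 0-255.
    Each bit is 1 where a pixel is brighter than the grid's mean,
    so small brightness or compression changes flip few bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two 64-bit hashes."""
    return bin(h1 ^ h2).count("1")

class BlockList:
    """Hashes of reported images; near-matching uploads are rejected."""
    def __init__(self, threshold=5):
        self.hashes = set()
        self.threshold = threshold  # max bit differences still treated as a match

    def report(self, pixels):
        """Register a reported image's hash on the block list."""
        self.hashes.add(average_hash(pixels))

    def is_blocked(self, pixels):
        """True if an upload is within `threshold` bits of any reported hash."""
        h = average_hash(pixels)
        return any(hamming_distance(h, known) <= self.threshold
                   for known in self.hashes)
```

A slightly altered copy of a reported image produces a hash a few bits away from the original, so it is still caught, while an unrelated image lands far outside the threshold and is allowed through. Production systems (such as Microsoft's PhotoDNA, which the industry hash-sharing schemes use) apply far more robust fingerprints, but the matching logic follows this shape.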
Expert forensic witness Peter Sommer, a professor of Digital Forensics at Birmingham City University, was called in by the girl's lawyers to examine Facebook's "technical and human monitoring" responses. "I helped formulate a series of requests for disclosure about [Facebook's] technologies, human monitors and relationships with NCA/CEOP, PSNI, NCMEC and IWF," Sommer explained on LinkedIn.
"Facebook [has] now chosen to settle - which is very good news for the victim. But I am sad that we weren't able to find out more about Facebook's filtering and monitoring facilities (or possible lack thereof)," Sommer said.
MacDermott criticised the police for their initially tardy response, claiming that by dragging their feet they lost the chance to discover who had been uploading the pictures: one suspect was not questioned promptly, and MacDermott suggested this was where the local police missed their opportunity.
"Had they gone that day and discovered his phone and discovered the image on it, they could have done something, but unfortunately they didn't do something for some time," he said. "In fairness, this began in 2014 so they may have improved their game since then. But in this case it was difficult to see why they didn't act quicker." ®