Joint UK-Oz probe finds face-recognition upstart Clearview AI is rubbish at privacy

Brit watchdog considering next steps, Australia's orders deletion of scraped image trove

A joint probe conducted by the Office of the Australian Information Commissioner (OAIC) and the UK Information Commissioner's Office (ICO) has found that facial-recognition-as-a-service company Clearview AI breached Australian privacy laws.

Clearview AI harvests photos from the public internet, uses AI to identify the people depicted, then offers law enforcement agencies a search engine to help ID suspects.
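Clearview's actual pipeline is proprietary, but the general pattern behind such services is well understood: convert each scraped face into a numeric embedding, then answer a query photo by nearest-neighbour search over those embeddings. A purely illustrative sketch, using made-up toy vectors rather than real model output:

```python
import math

# Illustrative only: Clearview's real system is proprietary. This sketches
# the common embed-and-search pattern. The "embeddings" are toy vectors,
# not the output of any real face-recognition model.

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# A scraped "index": identity -> face embedding (hypothetical toy data).
index = {
    "alice": (0.9, 0.1, 0.3),
    "bob":   (0.2, 0.8, 0.5),
    "carol": (0.4, 0.4, 0.9),
}

def identify(query_embedding, index, threshold=0.95):
    """Return the best-matching identity, or None if no match clears the threshold."""
    best_name, best_score = None, -1.0
    for name, emb in index.items():
        score = cosine_similarity(query_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

print(identify((0.88, 0.12, 0.28), index))  # close to "alice" -> alice
print(identify((1.0, 0.0, 0.0), index))     # no match above threshold -> None
```

At Clearview's claimed scale of billions of images, a brute-force scan like this would be replaced by an approximate nearest-neighbour index, but the matching principle is the same.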

The firm proudly touts scenarios such as using a profile pic on a suspected criminal's Facebook page to match "a photo of the suspect with his real name at a local country club." Once the suspect's name was known, the firm says the investigation proceeded more quickly and easily than would otherwise have been possible.

The Info Commissioners of Australia and the UK promised to investigate Clearview's image-scraping practices in July 2020. Australia's OAIC delivered its findings today and they're damning: Clearview collected personal information unfairly and without consent, failed to inform those whose photos it identified, didn't bother checking if its assessments were accurate, and paid scant attention to Australian privacy law.

"When Australians use social media or professional networking sites, they don't expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes," said Australian information commissioner and privacy commissioner Angelene Falk.

In her full report (PDF), Falk rejects Clearview's argument that it is exempt from Australian law because downloading images from the USA doesn't equate to doing business down under.

The document offers considerable detail about how and why the company's activities do in fact represent a breach of Australia's privacy laws and principles. Falk declared that Clearview must stop scraping images depicting Australians and destroy those it has already collected.

Clearview has since stopped offering its services in Australia and made its new business inquiry web page inaccessible from Australian IP addresses.

The commissioner also had some choice words for social networks.

"This case reinforces the need to strengthen protections through the current review of the Privacy Act, including restricting or prohibiting practices such as data scraping personal information from online platforms.

"It also raises questions about whether online platforms are doing enough to prevent and detect scraping of personal information."

Clearview, meanwhile, sails on serenely. The biz recently proudly revealed it has surpassed ten billion images in its databases, and has improved its ability to work with blurry images or photos depicting people wearing masks. ®
