Smash up your kid's Bluetooth-connected Cayla 'surveillance' doll, Germany urges parents

Or switch it off, bin it, bury it, whatever's necessary


Germany's Federal Network Agency, or Bundesnetzagentur, has banned Genesis Toys' Cayla doll as an illegal surveillance device.

"Items that conceal cameras or microphones and that are capable of transmitting a signal, and therefore can transmit data without detection, compromise people's privacy," said agency president Jochen Homann in a statement. "This applies in particular to children's toys. The Cayla doll has been banned in Germany."

Cayla's deportation and exile comes two months after privacy advocacy groups urged US and EU regulators to deal with the potentially privacy-infringing doll.

The Bluetooth-enabled toy comes with a microphone and is designed to capture children's speech so it can be analyzed using Nuance's speech recognition software, in conjunction with mobile apps.

Privacy and consumer protection groups have complained that the doll has been programmed to advertise to children, lacks security, and provides insufficient privacy guarantees about how captured data and personal information will be used.

Neither Genesis Toys, the Hong Kong-based maker of the doll, nor Nuance responded to requests for comment.

Germany's network watchdog said any toy capable of surreptitiously recording audio or video and transmitting it without detection is unlawful. The danger, the agency says, is that anything a child or anyone else says near the doll can be transmitted without parents' knowledge. A lack of network security could also allow the toy to be turned into a covert listening device, the agency suggests.

UK-based security research group Pen Test Partners has demonstrated that the toy's local database can be hacked. It also suggests the doll is vulnerable to man-in-the-middle attacks, backdoor attacks, and pairing with arbitrary Bluetooth devices. The firm refers to Cayla as "a bluetooth headset, dressed up as a doll."
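
To picture how trivial unauthenticated pairing makes this, here is a minimal illustrative sketch, not Pen Test Partners' actual tooling, of how any nearby Linux box could discover and connect to such a toy using the PyBluez library. The advertised device name and RFCOMM channel are assumptions for illustration only.

```python
# Illustrative sketch only: a toy that accepts Bluetooth connections
# without a PIN can be reached by any device in radio range.
# Assumes the PyBluez library; the device name and channel number
# are hypothetical, not taken from the researchers' findings.
import bluetooth

# Scan for nearby devices and fetch their advertised names.
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)

for addr, name in nearby:
    if "cayla" in name.lower():          # hypothetical advertised name
        print(f"Found doll at {addr} ({name})")
        # With no PIN or pairing confirmation required, any client
        # can simply open a connection to it.
        sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
        sock.connect((addr, 1))          # channel 1 assumed for illustration
        print("Connected without any authentication step")
        sock.close()
```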

Along similar lines, other tech-enabled toys, like Mattel's Hello Barbie doll in 2015, have been shown to lack adequate cybersecurity controls.

The agency's rules state that buyers of unlawful espionage devices may be required to destroy them and to provide proof of destruction in the form of a confirmation letter from a waste management facility.

In what might be read as an effort to encourage parents to destroy the doll, the Bundesnetzagentur says it assumes that "parents will take it upon themselves to make sure the doll does not pose a risk." However, its product notice also makes clear that the agency has "no plans at present to instigate any regulatory proceedings against the parents."

So any violence against Cayla is strictly discretionary. ®
