When Google sneezes, the internet gets flu

Algorithms don't help solve the net's biggest problems


We can remember it for you wholesale

The second issue is so unmentionable it's rarely raised in polite company. Again, it's an area in which Google is not the only company facing the problem - but it is uniquely ill-equipped to deal with it.

It's the question of how much we are prepared to disclose to an anonymous, friendly-looking computer system.

Earlier this year, AOL Research, quite deliberately and not without some pride, released the search queries of more than half a million users. It wasn't long before the anonymised queries were matched to their authors. AOL then expressed its horror, and dismissed the staff whom it had encouraged to disclose the information (a fine example in its own right of corporate responsibility).

But the cat was out of the bag. The story confirmed that people today choose to disclose information to Google that they wouldn't tell their husband or wife - and a search engine never forgets this intimate knowledge. They would almost certainly not disclose it if they thought it would be released - or available for casual perusal by cops (which it is).

Similarly, bloggers who blurt away only to be "discovered" are often shocked to learn their writing was visible. Did they think they had some super-selective invisibility cloak?

Technology evangelists of a utopian bent - people who believe this great detritus of disclosure now being collected by information systems such as Google will prove to be of great importance to us - argue that the future is safe. We'll adapt to the machines, they say. But history tells us that the opposite is true: computer systems that fail to be trusted, fail to be used.

Google's own experience corroborates this.

Usenet was a medium not too dissimilar to many blogs today: people wrote informally. The half-life of a Usenet posting was several weeks - it depended on the popularity of the newsgroup - but in most cases archives were not maintained, nor easily accessible. And Usenet continued in rude good health long after its demise was predicted. Then Google took over the creaking archive from Deja, and made searching the entire historical record trivially easy. People simply stopped using it. Usenet died the day Google turned it into a database.

Technical assurances are little use when the trust is lost - or was never there in the first place.

In another example, few countries think a nationally accessible medical records database is a good idea, and the revolt in the UK against the proposal to create one has led to an opt-out campaign and some (cosmetic) concessions from the government.

People expect records to be computerised - but they want them kept confidential, on a need-to-know basis, and not available to any browsing employer, insurance company, bureaucrat, or cop - which they effectively would be, given the current state of security.

The saga briefly resurrected HR 4731, Rep Markey's bill to "Eliminate Warehousing of Consumer Internet Data". Then Markey went back to lobbying on behalf of Google for "net neutrality" - and the bill remained stalled.

Google's response to this issue has been a public relations offensive and some muscular lobbying - to prevent more bills like Markey's.

"We are reasonably satisfied...that this sort of thing would not happen at Google, although you can never say never," was Google CEO Eric Schmidt's response to the AOL scandal.

Obsessively secretive, and determined to hoard every piece of data it mines from us, Google appears ill-equipped to restore confidence in the relationship between surfers and the systems we use.

The company initially fended off attention by pointing to its innate goodness. These days it points to its own cleverness. Neither virtue nor engineering talent can solve either problem, however.

Regulation looms in both areas. ®

