Imagine surviving WW3, rebuilding computers, opening up GitHub's underground vault just to relive JavaScript

We've heard of a code freeze but this is ridiculous: Microsoft finishes burying repos in Norwegian archipelago


Microsoft's GitHub on Thursday said that earlier this month it successfully deposited a snapshot of recently active public code repositories in an underground vault on the Norwegian archipelago of Svalbard.

GitHub's archival snapshot, taken on February 2, 2020, captured every public repo with at least one star and any commits in the preceding year, as well as every repo with at least 250 stars. The copied code consists of the HEAD of the default branch of each repo, minus binaries exceeding 100KB, packaged as a single TAR file.
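
For the curious, here's a rough, unofficial sketch in Python of the selection rule described above, using the public GitHub REST API. It is not GitHub's actual archiving pipeline: the repo name is a placeholder, and a repo's pushed_at timestamp is used as a crude stand-in for "commits in the year before the snapshot".

    # Rough sketch of the selection criteria described above -- not GitHub's
    # actual archiving pipeline. Requires the third-party `requests` package.
    from datetime import datetime, timedelta, timezone

    import requests

    SNAPSHOT_DATE = datetime(2020, 2, 2, tzinfo=timezone.utc)

    def would_qualify(owner: str, repo: str) -> bool:
        """Apply the two criteria the article describes to a public repo."""
        meta = requests.get(f"https://api.github.com/repos/{owner}/{repo}").json()

        stars = meta.get("stargazers_count", 0)
        if stars >= 250:        # criterion 2: 250+ stars, regardless of activity
            return True

        if stars >= 1:          # criterion 1: at least one star, and...
            # ...activity within the year before the snapshot date. The repo's
            # `pushed_at` timestamp is only a rough proxy for commit dates.
            pushed = datetime.fromisoformat(meta["pushed_at"].replace("Z", "+00:00"))
            return SNAPSHOT_DATE - timedelta(days=365) <= pushed <= SNAPSHOT_DATE

        return False

    print(would_qualify("torvalds", "linux"))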

GitHub says most of the data has been stored compressed and QR-encoded. A human-readable index and guide have also been included, describing the location of each repository and offering advice on recovery.
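
As a very loose illustration of that compress-then-QR-encode idea (not piql's or GitHub's actual on-film format), the Python sketch below packs a file's compressed bytes into a series of QR code images. It assumes the third-party qrcode package (with Pillow) is installed, and the chunk size is an arbitrary guess kept well under a QR code's binary capacity.

    # Loose illustration of "compress, then QR-encode" -- not the real piqlFilm
    # format. Assumes the `qrcode` package (with Pillow) is installed.
    import zlib

    import qrcode

    CHUNK_BYTES = 1024  # arbitrary; even a large QR code holds only a few KB

    def archive_to_qr(path: str) -> None:
        with open(path, "rb") as fh:
            compressed = zlib.compress(fh.read(), level=9)

        # Split the compressed stream into chunks and render one QR code each.
        for i in range(0, len(compressed), CHUNK_BYTES):
            chunk = compressed[i:i + CHUNK_BYTES]
            qrcode.make(chunk).save(f"frame_{i // CHUNK_BYTES:06d}.png")

    archive_to_qr("repo.tar")  # hypothetical input file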

The data amounts to 21TB of code, written to 186 reels of piqlFilm, a digital archiving medium that stores data on photosensitive film so files can be read offline. Boxes of these reels were delivered to Svalbard on July 8, 2020, to an underground storage vault that's similar in concept to the Svalbard Global Seed Vault though less plausibly useful.

"The code landed in Longyearbyen, a town of a few thousand people on Svalbard, where our boxes were met by a local logistics company and taken into intermediate secure storage overnight," said Julia Metcalf, director of strategic programs at GitHub, in a blog post.

"The next morning," she said, "it traveled to the decommissioned coal mine set in the mountain, and then to a chamber deep inside hundreds of meters of permafrost, where the code now resides fulfilling their mission of preserving the world’s open source code for over 1,000 years."

That may be a bit ambitious. PiqlFilm advertises a 500-year lifespan. But GitHub's blog post isn't likely to get the same preservation for posterity, so who will ever know?

GitHub staffers had planned to accompany the boxed reels on their journey to the Arctic but abandoned the junket in light of the coronavirus pandemic.


The code-storage biz originally announced its plans for the Arctic Code Vault at its Universe 2019 shindig in November 2019. The project is part of the GitHub Archive Program, an initiative the company has characterized as a way to preserve open source code for generations to come.

At the time, David Rosenthal, a veteran of Sun Microsystems and Nvidia and the co-creator of Stanford's LOCKSS [Lots Of Copies Keeps Stuff Safe] digital preservation program, expressed skepticism that anyone beyond the current generation will ever find the code useful. Nonetheless, he endorsed the spirit of the venture.

"No-one will decode this archive in the foreseeable future," he wrote in a blog post last year. "It is a PR stunt, or perhaps more accurately a koan, like the golden records of Voyager 1 and 2, or the Long Now's clock, to get people to think about the importance of the long term."

GitHub in fact is thinking more broadly about code preservation than just burying boxes in a Norwegian mine. The Arctic Code Vault is just one aspect of the GitHub Archive Program, which also encompasses "hot," "warm," and "cold" backup sources, where temperature refers to the frequency of updates.

Thus, GHTorrent and GH Archive provide "hot" storage that gets updated with recent events like pull requests. The Internet Archive and the Software Heritage Foundation provide "warm" GitHub archives that get updated occasionally.

Then there's "cold" storage like the Arctic Code Vault and Oxford University’s Bodleian Library, which will house a copy of the Svalbard data. Bugs in cold storage will be preserved in perpetuity or until cataclysm, whichever comes first.
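
The "hot" tier mentioned above is easy to poke at yourself. As a minimal sketch, assuming GH Archive's documented layout of one gzipped, newline-delimited JSON file per hour at data.gharchive.org, the following Python pulls a single (arbitrarily chosen) hour of events and tallies them by type:

    # Minimal sketch of reading GH Archive's hourly "hot" feed. The date/hour
    # below is arbitrary; requires the third-party `requests` package.
    import gzip
    import io
    import json
    from collections import Counter

    import requests

    URL = "https://data.gharchive.org/2020-02-02-12.json.gz"  # YYYY-MM-DD-H

    resp = requests.get(URL, timeout=60)
    resp.raise_for_status()

    counts = Counter()
    with gzip.open(io.BytesIO(resp.content), "rt", encoding="utf-8") as fh:
        for line in fh:
            event = json.loads(line)
            counts[event["type"]] += 1   # e.g. PullRequestEvent, PushEvent

    print(counts.most_common(5))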

Finally, there's the Microsoft Project Silica initiative to write data to quartz glass platters with a femtosecond laser in the hope it will last 10,000 years.

Here's to 10 millennia more of jQuery. ®
