Edgy: HPE's first message from the International Space Station to Microsoft's Azure? 'hello world'

Supercomputing above the clouds


Microsoft and HPE were cock-a-hoop yesterday with the trumpeting of data bursts from HPE hardware aboard the International Space Station (ISS) to Microsoft's Azure, starting with the inevitable "hello world".

HPE's first spaceborne computer, based on an Apollo 40-class system, returned to Earth in 2019 after nearly two years aboard the ISS. The second went up earlier this year, replete with Red Hat Enterprise Linux 7.8 to keep things ticking over and software to handle any failures when components receive a zapping in the harsh environment aboard the ISS.

Oddly, Microsoft's bragging on the matter yesterday concerning data bursts pumped from HPE's hardware into Microsoft's cloud failed to mention that Linux was running the show. Windows for Space Stations, anyone?

How are your cabling skills? HPE Spaceborne-2 (pic: NASA)

At the time of launch, the thinking was that Azure Space might be used for the occasional workload. After all, one of the goals of the Spaceborne missions is to demonstrate onboard autonomy, and Spaceborne-2 upped the hardware specs with GPUs aimed at machine learning and image processing, as well as Cascade Lake Intel Xeon processors (an improvement on the Broadwells of before).

Shoving HPE's Edgeline Converged EL4000 Edge System into the mix is the ultimate example of edge computing, and usage examples were given including having the hardware assess traffic trends, pollution, and missile launches.

However, Spaceborne-2 has also demonstrated the ability to offload computations to Azure, making best use of the meagre bandwidth (two hours per week with a maximum speed of 250 kilobytes per second, according to Microsoft) to send only the data that needs the extra scrutiny afforded by Microsoft's Azure computers.
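To put that meagre bandwidth in perspective, a back-of-the-envelope calculation (assuming the full two-hour weekly window is usable at the quoted peak of 250 kilobytes per second) gives the weekly downlink budget:

```python
# Rough weekly downlink budget for Spaceborne-2's link to Azure, based on
# the figures Microsoft quoted: two hours of connectivity per week at a
# maximum of 250 kilobytes per second.
WINDOW_SECONDS = 2 * 60 * 60       # two hours per week, in seconds
MAX_RATE_KB_PER_S = 250            # quoted peak speed, kilobytes/second

weekly_budget_kb = WINDOW_SECONDS * MAX_RATE_KB_PER_S
weekly_budget_gb = weekly_budget_kb / 1_000_000   # KB -> GB, decimal units

print(f"Weekly downlink budget: ~{weekly_budget_gb:.1f} GB")
```

Roughly 1.8 GB a week at best, which is why only data needing extra scrutiny makes the trip down.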

Dr Mark Fernandez, principal investigator for Spaceborne Computer-2 at HPE, told The Register: "I envision there will always be a need for processing at the 'core', i.e. edge-to-core or edge-to-cloud. But the future holds two promises. One is that edge processing will become more powerful and the need to process at the core will be reduced. Hence, self-sufficient computers enable self-sufficient explorers.

"Secondly, orbiting 'gateways' or next-gen Space Stations around the Moon and Mars may house supercomputing, AI/ML, Quantum, and other capabilities sufficient to address the requirements."

Thus far, HPE has completed four experiments (including shunting "hello world" down to Microsoft's cloud). Four others are under way and 29 more are in the queue.

The clock is ticking, however. The US portion of the ISS is only funded through 2024, although an extension seems likely unless somebody sends the station off into another spin with a borked science module.

"We've got to get as much as we can done in the time we have left," said Christine Kretz, vice president of programs and partnerships at the International Space Station US National Laboratory. ®


