AI algorithms can help erase the bright streaks left by internet satellites – but they cannot save astronomy

'We are absolutely losing some science'


Feature Hundreds of scientists around the world have been quietly volunteering their time to prevent low Earth orbit satellites from destroying astronomy.

Space is getting more and more crowded. As technology has advanced, lobbing things into orbit has become cheaper and more accessible for commercial outfits. Private companies are elbowing in and flinging their own satellites into low Earth orbit, typically promising to deliver ever-faster wireless broadband internet from their constellations.

When SpaceX began launching batches of its Starlink birds in 2019, the astronomy community realized the flying blocks of metal brightened the night sky and threatened to drown out the glow of distant stars and galaxies. Constellations of Starlink satellites whizzing in front of telescopes left dazzling streaks in their wake, making it difficult for astronomers to observe the cosmos.

The problem is only getting worse. SpaceX now has 1,600-plus internet-relaying satellites in the sky, while similar programs from the likes of Amazon, OneWeb, and Boeing are emerging.

SpaceX has plans to launch 42,000 satellites; Amazon has asked for permission to lob 7,774.

Bright satellite streaks ruining a view of the Perseid meteor shower in 2018. Image source: Eckhard Slawik

"We are absolutely losing some science," Jonathan McDowell, an astronomer at the Harvard-Smithsonian Center for Astrophysics, tells The Register. "How much science we lose depends on how many satellites there end up being. You occasionally lose data. At the moment it's one in every ten images."

Telescopes can try waiting for a fleet of satellites to pass before snapping their images, though if astronomers are tracking moving objects, such as near-Earth asteroids or comets, it can be impossible to avoid the blight.
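
One practical dodge is to check predicted satellite positions against a planned exposure window. The sketch below is a minimal illustration using the open-source skyfield library and publicly available TLE orbital elements; the site coordinates, file name, and altitude cutoff are assumptions for the example, not any observatory's actual scheduling code.

    # Minimal sketch: flag satellites that pass high over the site during a
    # planned exposure window. File name, site, and cutoff are illustrative.
    from skyfield.api import load, wgs84

    ts = load.timescale()
    satellites = load.tle_file("starlink_tles.txt")            # hypothetical local TLE file
    site = wgs84.latlon(-30.2446, -70.7494, elevation_m=2200)   # example southern site

    # Sample one minute of the exposure window, second by second
    times = ts.utc(2022, 6, 1, 3, 0, range(0, 60))

    for sat in satellites:
        alt, az, distance = (sat - site).at(times).altaz()
        if (alt.degrees > 30).any():                            # climbs well above the horizon
            print(f"{sat.name} passes overhead during this window - consider waiting")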

"As we raise the number of satellites, there starts to be multiple streaks in images you take. That's no longer irritating, you really are losing science. Ten years from now, there may be so many that we can't deal with it," he added.

McDowell co-chaired the Algorithms Group for SATCON2, a workshop hosted by the American Astronomical Society, and warned that scientists need to figure out how to mitigate the issue now, while satellite numbers are still relatively low, before it becomes too hard to catch up. One possible solution they're starting to explore is machine learning: AI software could be trained to automatically mask some of the bright satellite streaks in astronomical images.

One of the recommendations in the workshop's giant report [PDF] involves assembling a team of astronomers and computer scientists to develop a range of open-source tools for future researchers to use. To build the algorithms, they need to gather datasets of images snapped by various telescopes, showing the same patch of sky with and without satellite trails. Computer-vision algorithms can then be taught to detect the annoying streaks and adjust the affected pixels to cover them up.
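
As a rough illustration of the detect-and-mask idea, the sketch below uses a classical Hough line transform from scikit-image rather than a trained neural network; the thresholds and mask width are arbitrary assumptions that would need tuning for any real telescope.

    # Detect long straight trails in a frame and return a boolean mask covering them.
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import probabilistic_hough_line
    from skimage.draw import line
    from skimage.morphology import binary_dilation, disk

    def streak_mask(image: np.ndarray, width: int = 7) -> np.ndarray:
        edges = canny(image, sigma=2)                         # edges of bright trails
        segments = probabilistic_hough_line(
            edges, threshold=10, line_length=100, line_gap=5)
        mask = np.zeros(image.shape, dtype=bool)
        for (x0, y0), (x1, y1) in segments:                   # skimage gives (col, row) points
            rr, cc = line(y0, x0, y1, x1)
            mask[rr, cc] = True
        return binary_dilation(mask, disk(width))             # widen to cover the trail

    # Flagged pixels can then be interpolated over or excluded from analysis.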

Leaders of the workshop are trying to form collaborations between observatories and secure research funding to seriously develop a central hub for these future tools. At the moment, astronomers interested in working on the problem do it in their spare time or are scattered across various academic projects.

AI cannot do magic

Hossen Teimoorinia, a researcher at the University of Victoria, Canada, has been experimenting with different techniques for a while. "If you want to remove satellite traces to find moving objects you need to prepare a very good dataset," he tells El Reg.

Not only do you have to collect images from observatories and institutions, the shots also need to show exactly the same region of space with and without satellite interference, and they have to be pre-processed to make sure they're the same size and resolution, and so on. The other option is to add fake, artificial trails to clean images of the night sky to increase the number of training examples.
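
The augmentation route might look something like the following sketch, which injects a randomly placed artificial trail into a clean frame and keeps the ground-truth mask for training; the brightness, width, and blur values are made-up assumptions rather than anything matched to a real instrument.

    import numpy as np
    from skimage.draw import line
    from scipy.ndimage import gaussian_filter

    def add_fake_trail(clean: np.ndarray, rng: np.random.Generator):
        """Return (augmented image, ground-truth trail mask) for one random streak."""
        h, w = clean.shape
        r0, r1 = rng.integers(0, h, size=2)           # random entry and exit rows
        rr, cc = line(r0, 0, r1, w - 1)               # trail crossing the whole frame
        trail = np.zeros_like(clean)
        trail[rr, cc] = rng.uniform(5, 50)            # arbitrary streak brightness
        trail = gaussian_filter(trail, sigma=2.0)     # soften to mimic the optics
        return clean + trail, trail > 0.01

    rng = np.random.default_rng(42)
    clean_frame = rng.normal(100.0, 5.0, size=(256, 256))     # stand-in sky image
    augmented, truth_mask = add_fake_trail(clean_frame, rng)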

"It's a little bit time consuming. But hopefully we will be able to train one main model and use transfer learning so it can be fine-tuned to handle different images taken from different telescopes," Teimoorinia says.

It'll be tricky, however, to develop a single model robust enough to handle the varying properties of different telescopes: they have different resolutions, noise characteristics, and exposure times, and they operate across different wavelengths. "We may have to build algorithms that work for specific telescopes, it's complicated," McDowell says.

Ideally, these tools will one day be packaged as an easy-to-use Python library that astronomers can apply to their own images.
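
What that could look like for an end user is sketched below, purely as a hypothetical interface: the "streaktools" package and its functions do not exist, and the FITS file name is a placeholder. Only astropy.io.fits is a real dependency here.

    from astropy.io import fits
    from streaktools import detect_streaks, clean_image   # imaginary package

    with fits.open("night_123_exposure_045.fits") as hdul:
        data = hdul[0].data
        trails = detect_streaks(data)                 # pixel coordinates of each trail
        cleaned, flags = clean_image(data, trails)    # repaired image plus a flag map
        fits.writeto("cleaned.fits", cleaned, hdul[0].header, overwrite=True)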

AI cannot do magic, however, Teimoorinia warns: some science will still be lost in the process. Even if machine learning can erase the ugly satellite streaks so astronomers can monitor asteroids and comets, any stars or galaxies obscured by the glinting trails will be removed, too. While asteroids and comets can be tracked frame by frame as they move across the sky, stars and galaxies that sit behind a satellite's trail in an exposure will simply be obliterated during the cleanup.
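
The loss is easy to see in a toy example: once a streak mask is applied, any catalogued source whose pixels all fall under the mask has no usable data in that exposure. The frame, mask, and source positions below are invented purely for illustration.

    import numpy as np

    frame = np.random.default_rng(0).normal(100.0, 5.0, size=(200, 200))
    streak_mask = np.zeros_like(frame, dtype=bool)
    streak_mask[90:97, :] = True                      # a trail cutting across the frame

    sources = [(20, 50), (93, 120), (150, 30)]        # (row, col) of catalogued objects
    for r, c in sources:
        usable = ~streak_mask[r - 2:r + 3, c - 2:c + 3]
        if not usable.any():
            print(f"source at ({r}, {c}): fully masked - lost in this exposure")
        else:
            flux = frame[r - 2:r + 3, c - 2:c + 3][usable].sum()
            print(f"source at ({r}, {c}): usable flux estimate {flux:.1f}")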

Seeing things

Don't forget that these constellations of metallic birds reflect sunlight, and their radio signals can interfere with readings, making it potentially difficult for astronomers to accurately record light levels to estimate the distances or temperatures of faraway stars or to discover new galaxies.

Sometimes the opposite happens: something glittering in the sky doesn't just make it tough for astronomers to observe objects, it can make them see things that don't exist at all.

A flash apparently from GN-z11, the most distant galaxy yet found in the observable universe, generated excitement in the research community. People believed they had spotted the most distant gamma-ray flash ever, from an exploding massive star or a black hole. But now some reckon it was just sunlight reflected from a fragment of a broken-up spent Russian rocket that happened to be in view at the wrong time as astronomers observed GN-z11.

Similar mistakes could be made in the future with broadband satellites, McDowell says. "A lot of the time the effect of a satellite is really obvious, other times it's more subtle. If the light from a satellite is sent down a fiber for spectroscopy, it can contaminate the spectrum with reflected sunlight from the satellite. It could screw up data without you trying to spot it. Ordinary galaxies suddenly look really interesting, the bright lights make it look like something weird is going on there."

Help keep Earth's night sky dark

It's clear machine-learning-driven image processing simply won't be a panacea. The blight may well need a more drastic measure: limiting the number of satellites in low Earth orbit altogether. How many is too many? What is the maximum number of satellites that can be in orbit at any given time while keeping the sky observable?

"That's a wild guess at the moment," Richard Green, an astronomer at the Steward Observatory in the US, tells The Register.

Green believes the United Nations Outer Space Treaty, signed in 1967 to ensure "outer space shall be free for exploration and use by all States" and that "States shall avoid harmful contamination of space and celestial bodies," could be used to regulate global satellite launches in low Earth orbit.

America isn't the only country sending devices into space to provide broadband services. Even if it does try to control the number of satellites going up, it can't solve the problem on its own. "The UK and Canada are doing it too. China as well, although we know less about what's going on there," Green says.

It requires the cooperation of countries all around the world, and there has yet to be an all-inclusive international discussion on the matter, even though the International Astronomical Union is trying to appeal to the UN's Committee on the Peaceful Uses of Outer Space. "We need to seriously implement new policies or it'll become a free-for-all, where space will be taken by first come, first served," he adds.

Space is for everyone and the discoveries that have been made affect us all, McDowell concludes. "The fundamental things we've learned about ourselves, like the fact that we're all made out of star dust, for example, are immediately relevant. And who knows what we're going to discover or not discover in the next century because of satellites?" ®

