Autonomous space robots are going to be key to making new discoveries and exploring the furthest reaches of our Solar System and beyond, according to NASA scientists.
“By making their own exploration decisions, robotic spacecraft can conduct traditional science investigations more efficiently and even achieve otherwise impossible observations,” Steve Chien and Kiri Wagstaff, AI researchers working at NASA’s Jet Propulsion Laboratory, wrote in Science Robotics.
Autonomy will allow robots to react to sudden, unexpected phenomena - such as plumes erupting from distant comets - instead of waiting around for the next command sent from Earth.
AI and machine learning have a long history at NASA. It's tough to pinpoint exactly when the technology was first used, but Chien said its first appearance onboard a spacecraft was in 1999, with the Deep Space One (DS1) Remote Agent Experiment (RAX).
The experiment tested whether RAX could generate its own plans to achieve the spacecraft's mission goals over a 48-hour demonstration.
More modern spacecraft orbiting Earth employ machine learning classifiers to distinguish between snow, water and ice, so they can detect unusual events like volcanic activity, fires or floods. The same principle is used on the Curiosity rover to capture whirling dust devils kicked up by the Martian wind.
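The idea can be sketched in a few lines. This is a hypothetical illustration, not NASA's actual classifier: each pixel is labelled from a couple of spectral band intensities by simple thresholds, and anything that doesn't match an expected class is flagged as worth a closer look. The band names, thresholds and function names are all assumptions for the sake of the example.

```python
# Hypothetical sketch of onboard pixel classification: label each pixel
# by simple spectral thresholds so unusual scenes can be flagged.
# Bands and thresholds are illustrative, not a real flight model.

def classify_pixel(visible, near_infrared):
    """Classify a pixel from two band intensities, normalised to 0-1."""
    if visible > 0.7 and near_infrared < 0.3:
        return "snow"      # bright in visible, dark in near-infrared
    if visible < 0.2 and near_infrared < 0.2:
        return "water"     # dark in both bands
    if visible > 0.5 and near_infrared > 0.5:
        return "ice"
    return "other"         # unexpected signature - candidate for downlink

def flag_anomalies(pixels):
    """Return indices of pixels that match no expected class."""
    return [i for i, p in enumerate(pixels) if classify_pixel(*p) == "other"]

scene = [(0.8, 0.2), (0.1, 0.1), (0.6, 0.6), (0.4, 0.9)]
print(flag_anomalies(scene))  # only the last pixel is unclassified: [3]
```

Real onboard classifiers are trained models rather than hand-set thresholds, but the payoff is the same: the spacecraft can decide for itself which images deserve scarce downlink bandwidth.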
The idea of robots drawing up their own schedules is particularly interesting to Chien, as it enables a higher level of autonomy and intelligence.
“Understanding the competing objectives, and measurements you are trying to do, so that you can design software that can pack it all in, is quite a challenge. A lot of times it gets quite involved in the science - what you are trying to model, what you are trying to observe - whether it is a plume in the ocean, the evolution of a volcanic eruption, or the geology behind how a particular site evolved,” Chien told The Register.
All of these planning systems rely on modelling the spacecraft’s current state and resources, and use search algorithms to decide on a schedule.
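At its core, this kind of scheduling is a search problem: find the set of activities that maximises science return without blowing the resource budget. The sketch below is a toy version under assumed names - an exhaustive search over candidate observation plans against a single energy budget, nothing like the scale of a real flight planner.

```python
# Toy sketch of scheduling-as-search (illustrative names throughout):
# choose the subset of observations that fits an energy budget while
# maximising total science value, via exhaustive search over plans.

from itertools import combinations

def best_plan(observations, energy_budget):
    """observations: list of (name, energy_cost, science_value) tuples."""
    best, best_value = (), 0
    for r in range(1, len(observations) + 1):
        for plan in combinations(observations, r):
            cost = sum(o[1] for o in plan)
            value = sum(o[2] for o in plan)
            if cost <= energy_budget and value > best_value:
                best, best_value = plan, value
    return [o[0] for o in best]

obs = [("dust_devil_watch", 30, 5), ("drill_sample", 60, 8), ("panorama", 20, 3)]
print(best_plan(obs, 80))  # ['drill_sample', 'panorama']
```

A real planner must juggle many interacting resources (power, memory, thermal limits, pointing), timing windows and priorities, so it uses heuristic and iterative-repair search rather than brute force - but the shape of the problem is the same.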
NASA is currently developing an automated scheduler for its Mars 2020 rover mission. But to go further, robots will have to be able to explore unknown environments for days, weeks or even months without human support.
There is an ongoing project exploring potential technologies for autonomous submarines to detect signs of life underwater. It is hoped that one day, such submersibles could be used to probe the oceans hidden beneath the icy exteriors of Europa, Enceladus or Pluto.
It is believed that these watery worlds may have hydrothermal vents capable of supporting life, like the extremophile microbes that cluster around similar vents in Earth’s oceans - a possible hotbed that could explain the origin of life.
NASA is also looking at newer areas of AI, such as deep learning. It’s important to keep learning to achieve longer-term goals, Chien said. But applying these techniques is trickier, since space missions are very expensive. “We have few opportunities to launch [robots], so NASA does not want to take unnecessary risks, so most of the machine learning deployments are on the ground.”
The ultimate challenge would be to visit Alpha Centauri, the nearest star system to the Solar System - only 4.37 light years away. Last year, scientists announced that Proxima b, a possible rocky planet, was orbiting in the habitable zone of Proxima Centauri, one of the system’s stars.
“To traverse a distance of over 4 light years, an explorer to this system would likely endure a cruise of over 60 years. Upon arrival, the spacecraft would need to operate independently for years, even decades, exploring multiple planets in the system. Today’s AI innovations are paving the way to make this kind of autonomy a reality,” the paper said. ®