Autonomous spacecraft? Baby steps


Robot and astronaut meeting in space illustration

Robotic rovers trundle across the Martian highlands, eyeing the terrain and stopping to scoop up promising samples. A spacecraft powers across the cosmos while the crew sleeps. On the moon, a habitat’s brain automatically notices CO2 levels are off and adjusts them.

It’s the stuff of science fiction. And it’s going to stay that way for a while, said a NASA computer scientist working on putting artificial intelligence into space.

NASA has taken a few tiny steps in deploying AI, but robots exploring on their own and self-piloting spacecraft are a long way off, said Jeremy Frank, group lead of the planning and scheduling group in the Intelligent Systems Division at NASA Ames Research Center. He spoke to an Arizona State University audience Tuesday in a talk sponsored by the School of Computing, Informatics, and Decision Systems Engineering AI Group, in the Ira A. Fulton Schools of Engineering.

A few small subsystems have been tested in flight, including water-purity testing machines, a warning system on the Orion crew capsule and a system that manages the laptops on the International Space Station, which are critical pieces of equipment aboard the station.

(Fun fact: 180,000 pieces of information come down from the ISS every day, not counting payload data, according to Frank.)

Frank works on the development of automated planning and scheduling systems for use in space mission operations; the integration of technologies for planning, plan execution, and fault detection for space applications; and the development of technology to enable astronauts to autonomously operate spacecraft.

The Artemis program, which aims to land humans on the moon by 2024, is a stepping stone to Mars, and it’s going to use many of the same technologies.

That technology will require human interfaces and flight software integration.

“If we’re going to make our spacecraft autonomous, that’s what we’re going to have to have,” Frank said.

He predicted that will involve a combination of high performance computing and machine learning. “We’re not going to have our spacecraft learn” in flight, he said. (There’s also no way a flagship mission is going to go without ground communication, he added.)
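A minimal sketch of that split might look like the following, with parameters derived from archived data on the ground, frozen, and uplinked, so the flight software only checks new samples against them. The telemetry values, limits and function names here are invented for illustration and are a simplified stand-in for whatever model would actually be trained; none of it comes from NASA software.

# Hypothetical "learn on the ground, only infer in flight" sketch.
import numpy as np

# Ground segment: derive simple limits from archived telemetry.
archived_pump_current = np.array([2.1, 2.0, 2.2, 1.9, 2.1, 2.3])   # amps
mean, std = archived_pump_current.mean(), archived_pump_current.std()
FROZEN_LIMITS = (mean - 3 * std, mean + 3 * std)   # uplinked, never updated in flight

# Flight segment: no learning on board, just a check against the frozen limits.
def check_sample(sample_amps, limits=FROZEN_LIMITS):
    lo, hi = limits
    return "NOMINAL" if lo <= sample_amps <= hi else "CAUTION"

print(check_sample(2.15))   # NOMINAL
print(check_sample(3.40))   # CAUTION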

How far has NASA come? 

A warning system was tested on the first flight of the Orion crew capsule six years ago. A test project called the Advanced Caution and Warning System monitored the health of Orion’s critical vehicle systems using live data transmitted to the ground during the flight test.

The system was designed to monitor the mission from launch through splashdown, displaying information about failures and their effects. The team demonstrated future Mission Control Center and onboard displays that maximized awareness of what was happening during a failure. The system determined the cause of each failure and identified the components it affected, providing a comprehensive view of the spacecraft’s health. It also assisted operators with “what-if” queries that identified the next-worst failures, helping operations teams prepare for the most critical system issues.
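As a rough illustration only, that kind of “what-if” query can be sketched as a search over a component dependency graph. The components, graph and function names below are hypothetical, not NASA’s model, and a real system reasons over far richer vehicle data.

# Hypothetical sketch of an ACAWS-style "what-if" query -- not NASA's code.
# Toy dependency graph: component -> components that depend on it.
DEPENDENTS = {
    "power_bus_a": ["flight_computer", "comm_radio"],
    "coolant_pump": ["flight_computer"],
    "flight_computer": ["guidance"],
    "comm_radio": [],
    "guidance": [],
}

def affected_by(failure, graph=DEPENDENTS):
    """Return every component downstream of a failed one."""
    seen, stack = set(), [failure]
    while stack:
        node = stack.pop()
        for dep in graph.get(node, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

def next_worst_failures(current_failures, graph=DEPENDENTS):
    """Rank still-healthy components by how much new damage their failure would add."""
    lost = set(current_failures)
    for f in current_failures:
        lost |= affected_by(f, graph)
    healthy = [c for c in graph if c not in lost]
    return sorted(healthy,
                  key=lambda c: len(affected_by(c, graph) - lost),
                  reverse=True)

# With the comm radio already failed, losing power_bus_a or coolant_pump
# would do the most additional damage, so they top the ranked list.
print(next_worst_failures(["comm_radio"]))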

Future human space missions will put crews far from Earth. The one-way light-time delay to the moon is about 1.2 seconds. That’s enough to make continuous control from Earth difficult to impossible. The same delay to Mars ranges from 3 minutes to 22 minutes, depending on the planets’ positions, making real-time control from the ground impractical.
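Those figures follow directly from dividing distance by the speed of light; a quick back-of-the-envelope check with approximate distances (the exact values shift as the bodies move) reproduces them.

# Back-of-the-envelope one-way light-time delays; distances are approximate
# and vary with orbital positions.
C = 299_792_458            # speed of light, m/s

def one_way_delay_s(distance_m):
    """One-way signal delay in seconds for a given distance in meters."""
    return distance_m / C

MOON_AVG_M  = 384_400e3    # average Earth-moon distance
MARS_NEAR_M = 54.6e9       # Mars near closest approach
MARS_FAR_M  = 401e9        # Mars near maximum separation

print(f"Moon:        {one_way_delay_s(MOON_AVG_M):.2f} s")        # ~1.28 s (closer to 1.2 s at perigee)
print(f"Mars (near): {one_way_delay_s(MARS_NEAR_M)/60:.1f} min")  # ~3 min
print(f"Mars (far):  {one_way_delay_s(MARS_FAR_M)/60:.1f} min")   # ~22 min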

“These missions will require changing the capabilities of spacecraft, the roles and responsibilities of ground and crew, and the ways that ground and crew interact during the mission,” a NASA press release said.

“These conversations put into perspective how much work we have to do,” Frank said.

Top illustration: Alex Davis, Media Relations and Strategic Communications
