Autonomous spacecraft? Baby steps


Robot and astronaut meeting in space illustration

Robotic rovers trundle across the Martian highlands, eyeing the terrain and stopping to scoop up promising samples. A spacecraft powers across the cosmos while the crew sleeps. On the moon, a habitat’s brain automatically notices CO2 levels are off and adjusts them.

It’s the stuff of science fiction. And it’s going to stay that way for a while, said a NASA computer scientist working on putting artificial intelligence into space.

NASA has taken a few tiny steps in deploying AI, but robots exploring on their own and self-piloting spacecraft are a long way off, said Jeremy Frank, group lead of the planning and scheduling group in the Intelligent Systems Division at NASA Ames Research Center. He spoke to an Arizona State University audience Tuesday in a talk sponsored by the School of Computing, Informatics, and Decision Systems Engineering AI Group, in the Ira A. Fulton Schools of Engineering.

A few small subsystems have been tested in flight, including water purity testing machines, a warning system on the Orion crew capsule and a management system for the International Space Station’s laptops, which are critical pieces of equipment.

(Fun fact: 180,000 pieces of information come down from the ISS every day, not counting payload data, according to Frank.)

Frank works on the development of automated planning and scheduling systems for use in space mission operations; the integration of technologies for planning, plan execution, and fault detection for space applications; and the development of technology to enable astronauts to autonomously operate spacecraft.

The Artemis program, which aims to land humans on the moon by 2024, is a stepping stone to Mars and will use many of the same technologies.

That technology will require human interfaces and flight software integration.

“If we’re going to make our spacecraft autonomous, that’s what we’re going to have to have,” Frank said.

He predicted that will involve a combination of high performance computing and machine learning. “We’re not going to have our spacecraft learn” in flight, he said. (There’s also no way a flagship mission is going to go without ground communication, he added.)

How far has NASA come? 

A warning system was tested on the first flight of the Orion crew capsule six years ago. A test project called the Advanced Caution and Warning System monitored the health of the Orion’s critical vehicle systems using live data transmitted to the ground during the flight test.

The system was designed to monitor the mission from launch through splashdown, displaying information about failures and their effects. The team demonstrated future Mission Control Center and onboard displays that maximized awareness of what was happening during failures. The system determined the cause of each failure and identified the components it affected, giving a comprehensive view of the spacecraft’s health. It also assisted operators with “what-if” queries that identify the next-worst failures, helping operations teams prepare for the most critical system issues.
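
The article does not describe how the warning system works internally, but the behaviors listed above — flagging out-of-limit telemetry, tracing which components a failure affects, and answering "what-if" queries about the next-worst failure — can be illustrated with a minimal sketch. The component names, limits and dependency graph below are hypothetical, and this is not the actual NASA implementation:

```python
# Minimal sketch of the fault-monitoring ideas described above.
# All component names, limits and dependencies are hypothetical.

# Map each monitored component to the components that depend on it.
DEPENDENCIES = {
    "power_bus_a": ["flight_computer", "comm_radio"],
    "power_bus_b": ["life_support_fan"],
    "flight_computer": ["guidance"],
}

# Hypothetical telemetry limits: (low, high) acceptable range per sensor.
LIMITS = {
    "power_bus_a_volts": (26.0, 32.0),
    "power_bus_b_volts": (26.0, 32.0),
    "cabin_co2_ppm": (0.0, 5000.0),
}


def detect_failures(telemetry):
    """Return the sensors whose readings fall outside their limits."""
    return [
        name
        for name, value in telemetry.items()
        if name in LIMITS and not (LIMITS[name][0] <= value <= LIMITS[name][1])
    ]


def affected_components(failed_component):
    """Walk the dependency graph to list everything impacted by a failure."""
    impacted, frontier = set(), [failed_component]
    while frontier:
        current = frontier.pop()
        for downstream in DEPENDENCIES.get(current, []):
            if downstream not in impacted:
                impacted.add(downstream)
                frontier.append(downstream)
    return sorted(impacted)


def what_if(failed_components):
    """'What-if' query: which additional single failure would impact the most systems?"""
    remaining = [c for c in DEPENDENCIES if c not in failed_components]
    return max(remaining, key=lambda c: len(affected_components(c)), default=None)


if __name__ == "__main__":
    telemetry = {"power_bus_a_volts": 24.1, "power_bus_b_volts": 28.4, "cabin_co2_ppm": 3100}
    print("Out-of-limit sensors:", detect_failures(telemetry))
    print("Downstream of power_bus_a:", affected_components("power_bus_a"))
    print("Next-worst single failure:", what_if(["power_bus_a"]))
```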

Future human space missions will put crews far from Earth. The one-way light-time delay to the moon is 1.2 seconds, enough to make continuous control from Earth difficult to impossible. The same delay to Mars ranges from 3 minutes to 22 minutes, making ground-in-the-loop operations far harder still.
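
Those delay figures follow directly from distance divided by the speed of light. A quick back-of-the-envelope check, using approximate average and extreme Earth-moon and Earth-Mars distances (exact values depend on orbital geometry):

```python
# Back-of-the-envelope check of the one-way light-time delays quoted above.
# Distances are approximate averages/extremes, not mission-specific values.

SPEED_OF_LIGHT_KM_S = 299_792.458

EARTH_MOON_KM = 384_400          # average Earth-moon distance
EARTH_MARS_MIN_KM = 54_600_000   # approximate closest approach
EARTH_MARS_MAX_KM = 401_000_000  # approximate maximum separation


def one_way_delay_seconds(distance_km):
    """One-way light-time delay: distance divided by the speed of light."""
    return distance_km / SPEED_OF_LIGHT_KM_S


print(f"Moon: {one_way_delay_seconds(EARTH_MOON_KM):.1f} s")                     # ~1.3 s
print(f"Mars (closest): {one_way_delay_seconds(EARTH_MARS_MIN_KM)/60:.1f} min")  # ~3 min
print(f"Mars (farthest): {one_way_delay_seconds(EARTH_MARS_MAX_KM)/60:.1f} min") # ~22 min
```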

“These missions will require changing the capabilities of spacecraft, the roles and responsibilities of ground and crew, and the ways that ground and crew interact during the mission,” a NASA press release said.

“These conversations put into perspective how much work we have to do,” Frank said.

Top illustration: Alex Davis, Media Relations and Strategic Communications
