
A new approach to robotics

Innovative helper and assistive robots mark ASU’s celebration of National Robotics Week


A person works at a computer desk while a large robot made to look like a giant teddy bear stands in the background.
March 30, 2022

Today’s robots help us perform daily tasks at home and at work, work alongside medical practitioners to overcome medical challenges and physical disabilities, and now transport us through our daily lives on roadways and in the air.

National Robotics Week (April 2–10) is an annual event designed to showcase what’s new in robotics and inspire students to investigate STEM-related fields.

Robotics and augmented intelligence are components of all seven of the Ira A. Fulton Schools of Engineering at Arizona State University — the largest engineering program in the country. The following are just a few examples of robotics innovation taking place at ASU.

Helper robots

Video by Ken Fagan/ASU News

Students can always find a robot hug in Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory, where the focus is on human-robot interaction, robot autonomy and machine learning. The robot is learning how to interact with humans on a one-on-one, physical basis, and perhaps teaching humans that robots can have a calming role to play, too.

Suckermouth catfish were the inspiration for the swimming robots in Assistant Professor Daniel Aukes’ IDEA Lab. The fish-inspired robot for extreme environments, or FIRE robot, can navigate through tight spaces in canals and waterways. Like the scavenging catfish it mimics, FIRE is headed for a future as a cleanup robot.

Narrow, cluttered spaces are the domain of specialized drones in Associate Professor Wenlong Zhang’s Robotics and Intelligent Systems (RISE) Lab. Able to mimic bird-like flight, these drones can squeeze their bodies to fly through narrow spaces and then expand to full flight capacity.

Auto manufacturers are moving quickly forward with autonomous technologies, and the team in Professor Aviral Shrivastava’s Make Programming Simple Lab is hastening the process. By creating a broad-based platform through which different manufacturers’ vehicles can communicate with one another, with humans and with the environment, the team aims to enable safe, autonomous vehicle decision-making.

Assistive robots

Video by Ken Fagan/ASU News

Helping children with cerebral palsy and helping stroke patients regain full shoulder mobility are two of the recent projects in Associate Professor Hyunglae Lee’s Neuromuscular Control and Human Robotics Lab. For the cerebral palsy platform, Lee’s team uses human-robot interaction to improve a participant’s gait: data from the unaffected leg is used to adjust the platform and improve the neuromuscular response of the impaired leg.

Shadowy robotic figures create situational awareness in Assistant Professor Yu (Tony) Zhang’s Cooperative Robotic Systems Lab. The system allows a project manager to monitor what robots are doing in the background via virtual, non-interruptive shadows. Someday, medical practitioners may have goggles to see what a robotic assistant is doing in the background.

New robotics technology for interventional radiology may soon improve procedures to insert needles, catheters and neurostimulators by using a robotic arm to pull a magnet-guided device through the vascular system or tissue. The technique, developed in Associate Professor Hamid Marvi’s Bio-inspired Robotics, Technology and Healthcare (BIRTH) Lab, not only limits radiation exposure, but the magnetic pulling process can also steer around bone obstructions and avoid perforating an artery.

Originally designed to help U.S. Air Force aerial porters load pallets and lift cargo onto jets, Professor Thomas Sugar’s Aerial Port Exoskeleton, or APEx, will offer significant assistance to workers in a variety of industries, from warehousing to manufacturing to loading luggage on commercial flights to assisting postal workers. Created in the Human Machine Integration Lab with hip stabilizers and elements that provide robotic assistance for lifting, pushing and pulling, the device will also reduce workforce injuries. 

Top photo: Computer science master’s degree student Michael Drolet prepares the "hugging robot" in ASU Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory on Oct. 23, 2020. Ben Amor’s lab focuses on machine learning, human-robot interaction, grasping and manipulation, and robot autonomy, with the hugging robot "learning" to interact with humans. Photo by Deanna Dent/ASU News
