
A new approach to robotics

Innovative helper and assistive robots mark ASU’s celebration of National Robotics Week


A person works at a computer desk while a large robot made to look like a giant teddy bear stands in the background.
March 30, 2022

Today’s robots help us perform daily tasks at home and at work, assist medical practitioners in overcoming medical challenges and physical disabilities, and now transport us through our daily lives on roadways and in the air.

National Robotics Week (April 2–10) is an annual event designed to showcase what’s new in robotics and inspire students to investigate STEM-related fields.

Robotics and augmented intelligence are components of all seven of the Ira A. Fulton Schools of Engineering at Arizona State University — the largest engineering program in the country. The following are just a few examples of robotics innovation taking place at ASU.

Helper robots

Video by Ken Fagan/ASU News

Students can always find a robot hug in Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory, where the focus is on human-robot interaction, robot autonomy and machine learning. The robot is learning how to interact with humans on a one-on-one, physical basis, and perhaps teaching humans that robots can play a calming role, too.

Suckermouth catfish were the inspiration for the swimming robots in Assistant Professor Daniel Aukes’ IDEA Lab. The fish-inspired robot for extreme environments, or FIRE robot, can navigate through tight spaces in canals and waterways. Like the scavenging catfish that inspired it, FIRE is envisioned as a future cleanup robot.

Narrow, cluttered spaces are the domain of specialized drones in Associate Professor Wenlong Zhang’s Robotics and Intelligent Systems (RISE) Lab. Able to mimic bird-like flight, these drones can squeeze their bodies to fly through narrow spaces and then expand to full flight capacity.

Auto manufacturers are moving forward quickly with autonomous technologies, and the team in Professor Aviral Shrivastava’s Make Programming Simple Lab is hastening the process. The team is creating a broad-based platform through which vehicles from different manufacturers can communicate with one another, with humans and with the environment, enabling safe, autonomous vehicle decision-making.

Assistive robots

Video by Ken Fagan/ASU News

Helping children with cerebral palsy improve their gait and helping stroke victims regain full shoulder mobility are two recent projects in Associate Professor Hyunglae Lee’s Neuromuscular Control and Human Robotics Lab. For the cerebral palsy platform, Lee’s team uses human-robot interaction to improve a participant’s gait: data from the unaffected leg is used to adjust the platform and improve the neuromuscular response of the impaired leg.

Shadowy robotic figures create situational awareness in Assistant Professor Yu (Tony) Zhang’s Cooperative Robotic Systems Lab. The system allows a project manager to monitor what robots are doing in the background via virtual, non-interruptive shadows. Someday, medical practitioners may have goggles to see what a robotic assistant is doing in the background.

New robotics technology for interventional radiology may soon improve procedures to insert needles, catheters and neurostimulators by using a robotic arm to pull a magnet-guided device through the vascular system or tissue. The technique, developed in Associate Professor Hamid Marvi’s Bio-inspired Robotics, Technology and Healthcare (BIRTH) Lab, not only limits radiation exposure, but the magnetic pulling process can also avoid bone obstructions and the risk of perforating an artery.

Originally designed to help U.S. Air Force aerial porters load pallets and lift cargo onto jets, Professor Thomas Sugar’s Aerial Port Exoskeleton, or APEx, will offer significant assistance to workers in a variety of industries, from warehousing to manufacturing to loading luggage on commercial flights to assisting postal workers. Created in the Human Machine Integration Lab with hip stabilizers and elements that provide robotic assistance for lifting, pushing and pulling, the device will also reduce workforce injuries. 

Top photo: Computer science master’s degree student Michael Drolet prepares the "hugging robot" in ASU Assistant Professor Heni Ben Amor’s Interactive Robotics Laboratory on Oct. 23, 2020. Ben Amor’s lab work focuses on machine learning, human-robot interaction, grasping and manipulation, and robot autonomy, with the hugging robot "learning" to interact with humans. Photo by Deanna Dent/ASU News
