ASU’s AME program receives National Science Foundation grant for real-time motion analysis
TEMPE, Ariz. – Spearheading one of the nation’s few research initiatives to blend the work of artists, engineers and other experts, Arizona State University has received a $1.4 million grant from the National Science Foundation to study real-time motion analysis.
This is the first large-scale NSF grant for the year-old Arts, Media and Engineering graduate program, and it follows a National Endowment for the Arts grant – ASU’s largest – for the ground-breaking motion-analysis collaborative. Over the next five years, researchers hope for promising results in areas that range from gait-recognition security to spinal rehabilitation, from dance choreography to robotics.
Arts, Media and Engineering is a joint program of ASU’s Herberger College of Fine Arts and the Ira A. Fulton School of Engineering. It governs the Interdisciplinary Research Environment for Motion Analysis (IREMA) program, in which nearly 60 faculty from 10 disciplines, ranging from computer science and bioengineering to dance and psychology, share in teaching and researching motion analysis and in maintaining the motion-analysis labs that the NSF grant will support.
Over the past decade, human motion analysis has become an important research area with critical applications. But it is a complex problem due to the 3D nature of the human body and the multiple levels of movement in terms of time, space and energy. Progress has been slow because disciplines have been addressing the issues individually.
ASU’s motion analysis initiative “can serve as a new model for research and interdisciplinary collaboration, which can be adapted to other areas” to improve productivity, said the NSF in awarding the grant.
“ASU’s AME program is supporting artists and engineers who are pushing the limits of technology and creating new applications that are considered cutting edge in the worlds of art and science,” says Thanassis Rikakis, director of the AME program. “We are proud to be considered among the top initiatives of the NSF and the NEA.”
The entertainment, biodesign, security and military industries are continuously expanding their use of motion capture and analysis. But most motion analysis is not done in real time, limiting the instantaneous evaluation, feedback and correction that characterize movement learning in the real world.
“When we perform daily movements, like reaching for a glass of water, we receive instantaneous and continuous feedback from our body and our environment that enables us to actually reach the glass and bring it to our mouths,” Rikakis explained. “When you are trying to rehabilitate patients with certain neurological diseases, a motion-analysis system can evaluate their movements and offer corrective feedback in real time, becoming an effective substitute for their impaired senses.”
Ultimately, the technology will exist in many real-life environments such as hospitals, classrooms, stadiums, performance halls and airports. Applications may include movement rehabilitation; movement recognition for security purposes; technologies that encourage active learning in K-12; movement training for the fields of dance, theatre, sports, firefighting and the military; and movement enhancement for robotics or other human-computer interaction.
The NSF grant will enable AME to take its research to the next level, helping its research teams capture human movement in its full essence and enhancing interactive, real-time feedback capabilities.
The lead project investigator is Gang Qian, who joined AME last year in a joint appointment between the Herberger College and Fulton’s Department of Electrical Engineering. The co-leading faculty are Thanassis Rikakis (AME), Todd Ingalls (AME), Jodi James (AME), Sethuraman Panchanathan (Computer Science & Engineering), Jiping He (Bioengineering) and Michael McBeath (Psychology).
The project is complemented by an NEA technology grant and a separate NSF grant to develop media flow architectures (real-time control of audio, video and lighting on an intelligent stage) for arts performances.
For more information about AME, visit http://ame.asu.edu. For the NSF award announcement, visit http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0403428.
Media Contact:
Mica Matsoff
(480) 965-0478
Mica.Matsoff@asu.edu