ASU engineering students invited to Intel Cup competition in China


April 23, 2014

A team of students from the Ira A. Fulton Schools of Engineering has been invited to present a project at the Embedded System Design Invitational Contest, a part of the Intel Cup Undergraduate Electronic Design Contest. The competition is scheduled for July 15-20 in Shanghai.

The Fulton team will present a new robot that can be used to teach introductory students the fundamentals of programming in an easier, more engaging and less expensive manner. The robot is based on Intel’s latest processors for embedded applications, Bay Trail and Quark.

[Photo: two ASU engineering students talking with a robot on a table]

Held every two years, the Intel Cup competition is part of Intel’s commitment to education, and is designed to showcase the use of embedded systems using the latest Intel processors, said Yinong Chen, a senior lecturer in the School of Computing, Informatics and Decision Systems Engineering, one of the six Ira A. Fulton Schools of Engineering. Chen was a team coach and judge at the last competition, and will be a judge again this year.

An embedded system is a combination of computer hardware and software specifically designed for a particular function. Embedded systems are used in objects such as phones, cameras, airplanes, household appliances and toys.

Invitations to the Intel Cup are extended to top research universities that have ongoing collaborations with Intel, Chen said. Last time, 160 teams from across the globe participated. Eight were given first-place honors and 16 received second-place honors. ASU’s team was in the second category.

Teams are asked to design, implement and document a working prototype of an embedded system, but there are no set parameters on what the system can do.

Last time, participating teams designed, among other things, an embedded system that could recognize sign language, a shopping cart that could follow its shopper and a helicopter that could fly as well as travel on water. ASU’s team implemented an Ad Skip system that could recognize and skip advertisements when connected to a TV or video device.

In addition to providing the participating universities the necessary hardware, software and training for competition, Intel provides additional devices for teaching. This year, ASU received 35 Intel Galileo boards for regular teaching.

“We are the first group to get this next-generation equipment,” Chen said.

Intel Intelligent Systems Group, based in Chandler, is one of the largest employers of Fulton engineering graduates, Chen said. The company works closely with ASU to help ensure the university’s computer science and engineering courses are relevant to the industry.

This year, two teams are working together as part of the Computer Systems Engineering capstone course. They are creating a robot that students can use to learn programming skills. The software team is creating a user interface and a simulator, which will allow novice students to program a robot without knowing how to write computer code. Members of that team are David Humphries, Garth Bjerk, Ian Plumley, Nathanael Stout and Tracey Heath, all computer science majors in the School of Computing, Informatics and Decision Systems Engineering.

The hardware team is using the Intel processor to build a robot and write embedded code for it. Members of that team are Corey Jallen, Matthew Recchia, Randy Queen, Rizwan Ahmad – who was awarded the James F. Golder Memorial Scholarship – and Stephen Pluta, all computer systems engineering majors in the School of Computing, Informatics and Decision Systems Engineering.

They will go to Shanghai to present the project. The competition team is coached by Yann-Hang Lee, a professor in the School of Computing, Informatics and Decision Systems Engineering. Lee also teaches multiple ASU classes using the latest Intel architecture and devices.

Garret Walliman, a graduate student pursuing a master’s degree in computer science education, is working with both teams as part of his thesis. He will create an integrated system for computer science education.

Walliman said the graphical user interface makes learning easier.

“It allows students to learn the concepts of loops and ‘if’ statements without having to learn computer programming languages like Java or C++,” Walliman said. “It uses programming blocks that students can put together and then send to a simulator to test, or send directly to a robot. Keeping the syntax simple allows them to focus on the concepts.”

The simulator shows a tiny robot in a maze, and the students drag and drop the programming blocks into a window to tell the robot to move forward a certain distance, turn right and move forward again. The maze gets more complex as the student better understands the programming concepts.
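To make the idea concrete, here is a minimal sketch in Java of what a block sequence like that might translate to behind the scenes. The Robot class and its moveForward and turnRight methods are invented for illustration; the article does not describe the ASU tool’s actual generated code or API.

    // Hypothetical sketch of the code a drag-and-drop block sequence
    // might generate. The Robot class is an illustrative assumption,
    // not the actual API of the ASU teaching tool.
    public class MazeDemo {

        // A stand-in robot that tracks its position on a simple grid.
        static class Robot {
            int x = 0, y = 0;
            int heading = 1; // 0 = north, 1 = east, 2 = south, 3 = west

            // "Move forward" block: advance the given number of grid cells.
            void moveForward(int cells) {
                switch (heading) {
                    case 0: y += cells; break;
                    case 1: x += cells; break;
                    case 2: y -= cells; break;
                    default: x -= cells; break;
                }
                System.out.printf("Robot at (%d, %d)%n", x, y);
            }

            // "Turn right" block: rotate the heading 90 degrees clockwise.
            void turnRight() {
                heading = (heading + 1) % 4;
                System.out.println("Robot turned right");
            }
        }

        public static void main(String[] args) {
            Robot robot = new Robot();

            // The article's example sequence of blocks:
            // move forward, turn right, move forward again.
            robot.moveForward(2);
            robot.turnRight();
            robot.moveForward(1);

            // A "repeat" block corresponds to a loop -- one of the
            // concepts students learn without writing this syntax.
            for (int i = 0; i < 4; i++) {
                robot.moveForward(1);
                robot.turnRight();
            }
        }
    }

The point of the blocks is that students compose exactly this kind of logic visually, and only later see the equivalent Java or C++ syntax.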

“In the first course, students learn Java and C++, but it’s difficult and boring,” Chen said. “They get discouraged and think they don’t like programming. But when you use the graphic drag-and-drop program, which keeps it very simple and puts the programming in context, students really understand it.”

The hardware team designed a robot that the students can use to test the programming. It can be used for the 100-level computer programming class that all engineering students are required to take.

Currently, students use a Lego robot that costs about $350 and uses Lego’s closed architecture, which is proprietary and cannot be viewed or altered by the students.

The ASU-designed robot costs $150, about as much as a college textbook, and uses open architecture, which will allow students to see and learn more about how everything works together.

It would allow students to purchase a robot for the 100-level course and use it for later courses, too.

“It is an open, reusable architecture for multiple courses,” Chen said.

The team is testing the system with students in the freshman class, FSE 100. Walliman said the program could be used for middle-school students up to college students.

“I’ve worked with people who had significant trouble learning to program because they didn’t understand the fundamental concepts,” Walliman said. “They don’t know how to design and build algorithms. Instead, their biggest takeaway from FSE 100 is: ‘God help you if you forget a semicolon!’”

At the same time, Walliman said learning to get “Hello World” to show up on your computer screen isn’t always a big motivator for students, so working with a robot can make it more fun and keep them engaged.

Walliman said the program can be used in schools or summer camps, and even could become a phone app game, like Angry Birds, that would allow kids to play and learn.

Walliman hopes it eventually could become a start-up business that would improve the way computer programming is taught.

Brain control: Taking human-technology interaction to the next level


April 24, 2014

Modern military defense planning is already heavily focused on how to gain strategic advantage through brainpower. Another significant step in that direction could result from an Arizona State University engineer’s new research on using cognitive abilities to control defense operations in more direct ways than ever.

Panagiotis Artemiadis is exploring the potential for effective control of technology “simply by thinking.”

[Photo: two people interacting with a robotic arm]

His project serves the growing needs of the U.S. Air Force for more advanced “mixed human-machine decision-making,” as described by the Air Force Office of Scientific Research.

The agency wants the research to provide methods and models for developing “an actionable knowledge framework between humans and multi-agent systems.” One example would be a system enabling direct communication from an individual’s brain to a squadron of unmanned, semi-autonomous aircraft.

Artemiadis was recently awarded a grant of $360,000 from the Air Force’s Young Investigator Research Program, which seeks to put younger scientists and engineers “who show exceptional ability and promise for conducting basic research” to work helping the Air Force solve its technological challenges.

His project was one of only 42 selected from more than 230 proposals for funding from the program.

Artemiadis is an assistant professor of mechanical and aerospace engineering in the School for Engineering of Matter, Transport and Energy, one of ASU’s Ira A. Fulton Schools of Engineering.

His expertise is in the field of robotics and control systems, focusing on human-oriented robotics, ranging from prosthetics and exoskeletons to rehabilitation and assistive robotic devices.

He is the director of ASU’s Human-Oriented Robotics and Control (HORC) Lab and the editor-in-chief of the research journal Advances in Robotics & Automation.

For his Air Force project, he will explore the brain’s perceptive and predictive capacities to assess its ability to perform effectively in “human-swarm closed-loop” communication and control systems.

At a fundamental level, he is seeking to better understand “the mechanisms involved in how the brain perceives information it receives from observing moving multi-agent systems.”

In this case, “swarms” and “multi-agent systems” refer to multiple robotic, autonomous vehicles in motion, primarily aircraft.

“We are going to look at people’s brain signals while they are watching swarms, and understand how the brain perceives high-level information from these swarms,” Artemiadis explains.

The goal is to find out if individuals can reliably maintain a high level of cognitive performance in coordinating the movements and actions of a swarm. The controllers would need to prove themselves capable of effectively strategizing both reactively and proactively in high-pressure, high-risk situations.

Such a closed-loop, brain-machine interface system would integrate the observational capacities of the human and the machines. “It’s a mixture of your perception and the machines’ perception,” he says.

While this mix of perceptions would be accomplished through a system involving human-computer interaction, the computer functions simply as a conduit for communication between the machines and an individual’s cognitive activity.
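As a purely illustrative sketch of what such a closed loop might look like in software, the Java skeleton below separates the four stages the article implies: reading brain signals, decoding an intent, commanding the swarm and feeding the swarm’s state back to the operator. Every name here is a hypothetical placeholder; the article gives no implementation details.

    // Highly simplified, hypothetical sketch of a closed-loop
    // brain-swarm interface. All interfaces below are invented
    // placeholders, not a real API from the research.
    public class BrainSwarmLoop {

        interface BrainReader { double[] readSignals(); }       // e.g., raw EEG samples
        interface IntentDecoder { String decode(double[] s); }  // signals -> a swarm command
        interface Swarm { void execute(String command); String observe(); }
        interface FeedbackDisplay { void show(String swarmState); }

        // One pass of the perceive-decode-act-observe cycle: the computer
        // acts only as a conduit between brain activity and the swarm.
        static void step(BrainReader brain, IntentDecoder decoder,
                         Swarm swarm, FeedbackDisplay display) {
            double[] signals = brain.readSignals();
            String command = decoder.decode(signals);
            swarm.execute(command);
            display.show(swarm.observe());
        }
    }

In a working system this cycle would repeat continuously, which is what makes the loop “closed”: the operator’s perception of the swarm feeds back into the next command.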

“You would not have to send a command by typing on a computer keyboard or by voice command,” Artemiadis says. “You literally would just have to think about it.”

The idea sounds “a little like science fiction,” he adds. “It is beyond the human-technology interaction systems we have at the moment.”

Artemiadis will be able to give at least two ASU engineering doctoral students opportunities to assist him in the research.

Joe Kullman

Science writer, Ira A. Fulton Schools of Engineering

480-965-8122