ASU program will combine creation of technology with literature, sociology
Editor's note: This story is being highlighted in ASU Now's year in review. Read more top stories from 2019.
Artificial intelligence algorithms have become pervasive in daily life, but should they be? And what are the drawbacks and advantages of using machine learning?
Several Arizona State University faculty members have won a grant from the National Endowment for the Humanities to create a new curriculum that will challenge students to think about these complex issues while they’re learning how to create the technology.
The grant is funding a yearlong process for the School of Arts, Media and Engineering to create the new program, which will be a concentration within the existing Bachelor of Arts in digital culture. The school is housed in both the Herberger Institute for Design and the Arts and the Ira A. Fulton Schools of Engineering.
Suren Jayasuriya, an assistant professor with joint appointments in the School of Arts, Media and Engineering and the School of Electrical, Computer, and Energy Engineering, is the project director for the grant and said the program will be integrative.
“You could go to a history or philosophy department or English lit department and learn about artificial intelligence. Or you could go to a computer science department and learn how to build AI,” he said.
“But this program is unique in trying to bring it together.”
Jayasuriya works in computer vision, the application of artificial intelligence to visual media.
“It’s things like how computers recognize objects and images, how they analyze images to understand object shapes, location and what an object’s utility is,” he said.
“It’s being used for self-driving car technology.”
Jayasuriya also has an interest in philosophy and literature and co-taught the “Prototyping Dreams” course with Ed Finn, associate professor in the School of Arts, Media and Engineering and the Department of English and founding director of the Center for Science and the Imagination.
“So we came up with this idea of developing a curriculum for the AME program that meets this dual need of both teaching students about the underlying technology, like what is and isn’t possible, but also the social-cultural knowledge behind AI,” Jayasuriya said.
In the grant proposal, the reading list for the courses includes classics from Isaac Asimov, including “I, Robot,” and content that builds on ASU’s Frankenstein Bicentennial Project to highlight creativity and responsibility. Besides Jayasuriya and Finn, the other faculty involved in creating the new program are Pavan Turaga, associate professor in the School of Arts, Media and Engineering and the School of Electrical, Computer, and Energy Engineering and director of the Geometric Media Lab, and Xin Wei Sha, professor and director of the School of Arts, Media and Engineering and director of the Synthesis Center.
Jayasuriya answered some questions from ASU Now about creating the new program.
Question: In the grant proposal to the National Endowment for the Humanities, the team mentions the “anxiety and fear” around artificial intelligence, and specifically cites the “Terminator” movies. Do you hope to address these negative perceptions?
Answer: There are a couple of tropes that get amplified.
I’m not saying that it’s positive or negative, but we noticed this storytelling, and we wanted to develop a curriculum that gives voice to some of those stories but also to other stories.
Basically for the National Endowment for the Humanities proposal, part of my goal was, if you want to have humanities students for the 21st century who are going to deal with these technologies in their society and their workplaces and their lives, how do you effectively train them for this emerging domain?
Q: How much technology will the students learn?
A: We want to introduce them to some basic technology so they can think creatively and design in AI spaces, but it’s not necessarily the goal for them to build a state-of-the-art AI system, although we wouldn’t discourage that.
Each course will be designed to bring them up to speed on programming, data handling and other things you would need to deal with AI technology, but the focus is to give them a broader education.
Q: So the graduates won’t be the people who are doing the programming but will be the people working with the programmers?
A: That’s one option. Or digital journalism or marketing. Even human resources will be affected by AI technologies. A lot of HR companies are using AI systems to help with recruitment and intake, and you will need knowledge of how those systems work and of their hidden biases.
Q: What kind of biases?
A: It’s interesting to talk about how AI reflects social inequities.
It can be pernicious. AI systems learn from large data sets, and those data sets reflect the social inequities in society, so the networks implicitly learn these biases. Think of AI systems for determining insurance rates.
So “Terminator” and robots are the stuff of popular narratives, but some of these other stories are less well known yet affect the world just as much.
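The point about inherited bias can be illustrated with a toy sketch. The data below is entirely invented; it only shows the mechanism Jayasuriya describes: a model fit to skewed historical decisions reproduces the skew, even though it knows nothing about any individual.

```python
# Toy illustration (invented data) of bias inherited from training data.
# Historical decisions were skewed between two groups; a "model" that
# simply learns each group's historical approval rate reproduces the skew.
historical_decisions = {
    # group: (approved, denied) counts in the training data (made up)
    "group_a": (90, 10),
    "group_b": (40, 60),
}

def learned_approval_rate(group: str) -> float:
    """Approval rate the model learns for a group from biased history."""
    approved, denied = historical_decisions[group]
    return approved / (approved + denied)

print(learned_approval_rate("group_a"))  # 0.9
print(learned_approval_rate("group_b"))  # 0.4
```

Nothing in the code mentions any real-world attribute, yet the learned rates differ sharply between groups, which is exactly how a system trained on biased records, such as past insurance or hiring decisions, can encode hidden bias.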
Q: What’s the Prototyping Dreams course?
A: Prototyping Dreams is a required four-week module in our digital culture program.
Ed Finn and I co-taught it and focused on prototyping development to help with storytelling. One module we did was on minds and machines, and we had the students build a working chatbot in the Python programming language, starting from scratch.
We read works from Descartes and John Searle, and we read about the Turing Test and something called the Chinese Room Argument.
We had students reflect on AI in society, and at the end we held a gallery exhibition at the Tempe Center for the Arts, where people came and interacted with the students’ chatbots.
So we’re going to build that out into a full course extending beyond chatbots to visual media and other kinds of algorithms.
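A from-scratch chatbot of the kind students might build can be sketched in a few lines of Python. This is an illustrative pattern-matching bot loosely in the spirit of ELIZA, not the actual course assignment; the rules and responses are invented for the example.

```python
import re

# Minimal rule-based chatbot: each rule pairs a regex with a canned
# response template. These rules are illustrative, not from the course.
RULES = [
    (re.compile(r"\b(hello|hi|hey)\b", re.I), "Hello! What would you like to talk about?"),
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye!"),
]

def respond(message: str) -> str:
    """Return the first matching response, substituting captured text."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return "Tell me more."  # fallback when nothing matches

print(respond("hi there"))        # Hello! What would you like to talk about?
print(respond("I feel curious"))  # Why do you feel curious?
```

Even a bot this simple makes the course's philosophical readings concrete: it passes a casual exchange without any understanding, which is exactly the tension behind the Turing Test and Searle's Chinese Room Argument.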
Q: What are faculty in the School of Arts, Media and Engineering interested in?
A: They’re interested in new technologies and how they can be used for media arts and sciences, and that takes various forms, such as visual media like images, projections, video feedback and virtual reality and augmented reality systems.
It could be audio. A lot of our faculty build new types of musical instruments and musical audio interfaces.
I’ve taught courses on computational cameras — rethinking what a camera is: building new types of cameras that can see underwater, around corners, or through fog and smoke, seeing things that are not generally visible by using new types of optics, signal processing and AI.
I’ve taught Understanding Activity, which is designed to create experiential media systems that interact with a user and give feedback using audio, visuals and touch.
The general cohesion of AME is the use of technology in the media arts and sciences, and the reasons for why you would do that. We think about creative practice and design skills and philosophy and stories we tell using that technology.
Top image by Pixabay