The role of AI in higher education

ASU faculty members share their thoughts on how AI can shape education today and in the future



In recent decades, advances in machine learning and neural networks have enabled artificial intelligence systems that mirror human capabilities, dramatically shifting many landscapes and ushering in a new era of AI characterized by adaptability and learning.

Today, AI’s fingerprints can be found everywhere — from voice assistants in our homes to advanced research tools in our labs. Education is no exception. In the not-so-distant future, AI-driven personalized learning platforms are set to reshape the learning experience, using personal data to tailor content based on each student's needs. 


We talked with three faculty members at ASU to gain insight into how AI is currently shaping education across the university and beyond.  

Drawing on their unique and shared experiences in academia, the faculty members touched on three central themes, along with specific insights to advance our understanding of AI tools in the classroom.

1. Guiding AI transformation with digital literacy

The promise of generative AI in higher education is compelling. It can bring abstract ideas to life through visual aids, enhancing teaching and learning. It can handle routine tasks, freeing educators to focus more on teaching and students' individual needs. 

“I think it’s a marvelously exciting time to be a student, an educator, a researcher or an artist because we have to fundamentally look at our basic assumptions and ideas about how we interact with each other and our environment,” says Lance Gharavi, a professor in the School of Music, Dance and Theatre, part of the Herberger Institute for Design and Arts.

The challenge, Gharavi points out, is ensuring that we actively guide this transformative process.

“We have to ask ourselves why we’ve been doing things a certain way for decades, and maybe rethink our approach — and that is both exhilarating and anxiety producing.”

But where do we draw the line?

“There’s a problem with the limiting principle — a rule or guideline that tells us when to stop or how far to go with something,” Gharavi says. “On the one hand, we have something as simple as artificial intelligence like spell check, and on the other, we have something as sophisticated as ChatGPT — it’s a complicated conversation. Because if we’re not acknowledging the messiness, we’re not being honest.”

All three faculty members underscored the importance of teaching digital literacy alongside the integration of AI tools. Digital literacy serves as a compass of sorts, guiding users to ensure that AI tools do not inadvertently strengthen biases or become channels for malpractice and propaganda.

“From the very beginning, propaganda has been a part of our nation's fabric," says Retha Hill, director of the New Media Innovation and Entrepreneur Lab at the Walter Cronkite School of Journalism and Mass Communication. "But today, we're seeing persuasive, AI-generated visuals accompanying targeted messages.

“AI can fabricate news articles filled with plausible-sounding narratives and invented quotes, and people tend to believe them. Now more than ever, understanding and discerning the credibility of content is crucial.”

2. A closer look at embedded bias 

The faculty members also emphasized the need for diverse input, continued development and regular audits to ensure AI tools are not perpetuating biases — but it’s not that simple. Intelligent learning machines, like people, are inherently complex.

“What we see now in generative AI is an offshoot of a whole generation of researchers trying to solve what is called the object recognition problem, where the goal was to take a picture, detect objects and give it a name and category,” says Pavan Turaga, director and professor in the School of Arts, Media and Engineering.

AI technology today is the result of many researchers aiming to teach computers to recognize and categorize content, Turaga says, but issues occur when this technology is used to generate images of people, especially when those people fall into uncommon categories.

To showcase this, Turaga asked DALL-E, a generative AI system that creates digital images from text prompts, for an image of a “non-binary math teacher in a classroom.” Turaga, who uses they/them pronouns, says the result was neither accurate nor fair, likely because the AI’s learning examples for this category are limited and do not represent its true diversity.

“Human beings always spill over into buckets, like race is not cleanly defined into six buckets — there are mixed-race people; gender is not two buckets or even three buckets — it's considered a spectrum,” Turaga says. “So human attributes often defy the notion that things can be in nice little buckets. So how do you get past that? That's the big question.”

3. Fostering opportunity and the future of AI  

So, just how much is generative AI set to reshape the landscape of higher education? And more specifically, how do we navigate AI transformations with integrity?

In the realm of content creation, Turaga sees AI not as a looming threat but as a tool that helps their students stay competitive. 

“Content creation is going to be impacted in a different way than, say, a writing program,” Turaga says. “Our students are actively addressing AI tools, because we have to be at the cutting edge of content creation — no matter what the tool is — and right now it is AI.”

Hill says there is a need for transparency in the deployment of AI tools, especially regarding attribution. 

“It’s important we lead with transparency and we can accomplish that in part with proper attribution,” she said. “In the projects where we have used generative AI, we say that it was created through MidJourney, or Stable Diffusion, or whatever the tool is — so we're upfront about our attribution and sourcing.” 

But all three faculty members acknowledged AI’s potential to enhance teaching and learning, as long as we keep an eye on the ethical implications. 

“We find ourselves in a time that is both exhilarating and frightening. Change is scary and it's exciting,” Gharavi says. “I think the potential to radically transform higher education is really there, and we have to make sure that we are working vigorously to steer how that change happens.”

Written by Kevin Pirehpour, Enterprise Technology.

Top image generated using the generative AI platform MidJourney. Prompt: “Educators envisioning the future of AI-enhanced universities and society, with tech developments, suspended in air, swirling around." 
