Adapting higher ed to AI
W. P. Carey information systems expert asks students to embrace the power of people
Arizona State University's Geoff Pofahl, a clinical assistant professor of information systems, has used generative AI to enhance teaching and learning for years.
Whether experimenting with new ways to help instructors streamline course preparation and grading, or guiding students on how to ethically use technology on assignments and in the classroom when appropriate, Pofahl knows AI is a powerful tool.
But it's a tool that has a time and place.
"I communicate to students what AI they can use and when," says Pofahl, "but I've also been consistent about my expectations: They need to look for opportunities to let their humanity shine through, whatever the end delivery happens to be."
Students today are doing more than just getting comfortable with generative AI; some are highly proficient. So, this fall, instead of focusing on how he can help students integrate AI into their learning, Pofahl is trying something new.
"Some colleagues and I agreed that a major problem in higher education is that we focus almost exclusively on the product or outcome," says Pofahl. "If I give students instructions to turn in an assignment, it could all be AI generated. Instead, we need to find better ways to evaluate processes."
This year, Pofahl is making several changes to his curriculum that will encourage AI use while emphasizing the importance of deep learning and human expression.
One way he's doing this is by implementing unscripted, self-recorded presentation assignments. Students can use AI tools to build and enhance their presentations, but they must talk through their thought process or apply information learned in class. That way, Pofahl can verify that students are grasping the course material and not relying solely on technology to complete their assignments.
"They take on the role of the teacher and can discuss a topic we covered in class, or extend it to an industry we didn't cover," he says. "It forces students to think about the course materials, which creates connections in their mind that help the material stick. And that's a skill we're going to need more and more in an AI-driven world."
Pofahl first decided to require that presentations be unscripted after noticing students were relying heavily on AI to write their scripts, which were rife with "glittering generalities," or business lingo with no data or specific examples to back it up.
Oftentimes, Pofahl explains, these generalities aren't incorrect, but they indicate a student who is relying too heavily on AI to complete assignments. By requiring students to substantiate those claims verbally in their presentations and cite their sources, Pofahl hopes to encourage a deeper dive into the coursework.
Since students will use AI in their work and personal lives long after graduation, Pofahl's reasons for updating his student evaluation processes extend far beyond the classroom.
"Everything that they produce is going to be viewed through a lens of: Could AI do that? Could AI do this better?" he says.
New uses for AI emerge every day, and Pofahl anticipates that some managers and corporations will be tempted to substitute AI for human employees. The possibility of being replaced by AI makes the "human" element of students' work all the more important.
His advice to students is to remember that just because something is easy doesn't mean it's good. And oftentimes, the most tedious and cumbersome parts of a project — when students feel the strongest temptation to use AI — present the best and most enjoyable opportunities to make a human impact on the audience, whether through storytelling, design choices or the inclusion of personal anecdotes.
"For me, those processes aren't tedious because I'm thinking about the delivery, about the audience," says Pofahl. "There's a reframing that needs to happen, and maybe it could reduce the temptation for students to push the easy button every time they can."
And for those who aren't using AI in their work, Pofahl says it's still a problem if their assignments sound as though AI wrote them.
"At some point, someone will look at your work and compare it to AI. And if your work looks like AI — even if it isn't — they will assume that it is. Is that the impression you want to make?"
Learn more about AI as a resource at ASU.