
AI on Campus: Rethinking the Core Goals of Higher Education

Instructors are evaluating how artificial intelligence affects the core goals of education and adjusting their teaching accordingly, prompting conversations about critical thinking and changing workforce expectations.

An instructor at the front of a dark lecture hall by a screen with students watching in the foreground. (Adobe Stock/Gorodenkoff)
For many professors, teaching has always been about more than delivering subject-specific content. Derek Bruff, director of the University of Virginia’s Center for Teaching Excellence, said the core mission of college is to help students develop critical thinking, problem-solving and judgment skills that prepare them for life beyond the classroom.

But with artificial intelligence offering such a convenient way to offload those skills, professors are re-evaluating how they approach their goals, with ripple effects across instruction, assessments and student interactions.

“I can’t recall another technology in my career that has had such a transformative effect on higher-ed teaching and learning,” he said.

WHAT MAKES A PREPARED GRADUATE?


Instructors agree that one of higher education’s core tenets is producing graduates who are ready to enter their chosen workforce. AI skills are a growing expectation across industries, and instructors are balancing teaching how to use AI with building the expertise needed to assess it.

Colin Campbell, a professor of physics and astronomy at the University of Mount Union in Ohio, said sometimes it is appropriate to ask students to pretend AI doesn’t exist, but instructors can’t slip into full denial.

For example, he still asks students to hand-draw vectors to understand how forces interact. He has also gone back to administering more handwritten quizzes with simpler problems to help confirm students have a strong foundation before interacting with AI.

“If you’re like, ‘Oh, AI will do it,’ well, who’s going to check the AI?” he said. “And what about when AI comes to a fringe case?”

That suspension of disbelief about AI’s ability to quickly solve simple problems only works if students learn to use AI strategically as they progress. A lower-level student, for example, might be asked to have AI generate code that simulates a complex physical system, with the graded work being their explanation of what the simulation shows. A more advanced student might instead use AI as a companion while writing that code themselves, demonstrating a grasp of computational physics. Both approaches require a baseline understanding of the physical principles at play.
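As a hypothetical illustration of the kind of assignment described here (this sketch is not from the article, and the specific system and parameters are assumptions), a student might be handed AI-generated simulation code like the following and asked to explain whether its output matches physical intuition — for instance, that a damped pendulum should lose energy and settle toward rest:

```python
import math

def simulate_pendulum(theta0, omega0, damping, dt=0.001, t_max=10.0):
    """Simulate a damped pendulum, theta'' = -sin(theta) - b * theta',
    with g/L = 1, using simple explicit Euler integration."""
    theta, omega = theta0, omega0
    for _ in range(int(t_max / dt)):
        alpha = -math.sin(theta) - damping * omega  # angular acceleration
        omega += alpha * dt
        theta += omega * dt
    return theta, omega

# The student's job: check the result against the physics.
# With damping, amplitude should decay well below the initial angle,
# and total energy (kinetic + potential) should decrease over time.
theta_f, omega_f = simulate_pendulum(theta0=0.5, omega0=0.0, damping=0.5)
energy_f = 0.5 * omega_f**2 + (1 - math.cos(theta_f))
energy_0 = 1 - math.cos(0.5)
```

The point of the exercise is the interpretation, not the code: a student who understands the physics can say why the final energy must be smaller than the initial energy, and can spot it if the AI-generated integrator gets that wrong.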

Some of an instructor’s work these days is helping students understand why it is important to walk the path and not just skip to the destination. Campbell said he talks with students about how relying on AI will hinder critical thinking abilities in the long run.

“If all you know how to do in the workforce is how to feed something into a chatbot and then take the output and say, ‘Here it is,’ then you are not employable,” Campbell said.

Asao Inoue, a professor of rhetoric and composition at Arizona State University, said the humanities face a parallel challenge. Courses can involve reflective writing and argumentation outside of the classroom, making it easier for students to outsource work to AI. That makes the conversations about why not to use it all the more important.

He tells students, “Nothing ever comes fast that’s worth keeping or that improves the human condition.”

However, he understands that AI is so enticing that just saying “don’t use it” is not enough. Inoue tries to acknowledge the pressures that exist in students’ lives — jobs, heavy course loads and familial responsibilities — that influence AI reliance.

At the beginning of his course, he asks students to write a few ethical promises around AI. Throughout the course, he has students recertify their promises and reflect on how well they lived up to them with each assignment.

His assignment design also naturally discourages overuse of AI. He requires students to submit all of the work that went into creating a finished product, including notes, brainstorming and early drafts. This, he said, makes doing the assignment honestly about as easy as faking it with AI.

While he doesn’t use AI to build his own courses, he thinks it can be helpful to students for providing more immediate feedback than a professor might, or higher-level feedback than a peer might.

Uma Ravat, who teaches probability, statistics and data science at the University of California, Santa Barbara, said she uses AI outputs intentionally in class, highlighting errors for students to spot and discuss.

“I can actually take a screenshot of it and use it in class to have a discussion about, ‘Is this right? Is this wrong?’” she said. “What assumption did AI make to get to this answer which is incorrect? I feel like that discussion, we are having it earlier, rather than later, and it helps build critical thinking skills.”

By making AI’s unpredictability a learning opportunity, students witness real-time problem solving and develop skills in critically evaluating AI-generated solutions.

She also uses AI to build interactive activities for large classes, sprinkling small, targeted exercises throughout her lectures.

INSTITUTIONAL SUPPORT AND CURRICULUM REDESIGN


Bruff said this kind of individual reflection is common among professors.

“Right now, what we’re seeing is a lot of individual faculty making choices,” Bruff said. “But I think what we need to see more of is some mix of those individual choices and the collective choice, the collective designing that needs to go on to make curricula work well.”

While majors already undergo occasional redesigns, Bruff said they take place every 10 or 20 years and usually do not drastically change. Some schools have taken another look at their programs specifically to address AI. At Mount Union, for example, faculty were offered the chance to redesign a single course or an entire program to integrate AI.

A more top-down approach can help ensure that students’ AI skills build alongside the skills they learn in their chosen major. That way, when students are asked to avoid AI early on, they can see where they are headed.

However, Bruff said, institutions shouldn’t rush into a program overhaul.

“We don’t know what the target is yet,” he said. “The technologies are changing … so it’s hard to know what the target is for some type of redesign effort, but I think it’s important that we go there.”

Course and curriculum updates could come at a more measured pace if baked into existing reflection processes.

“I think there has to be some level of expectation and support and guidance for that,” Campbell said. “But if a university or college is doing the things that they ought to be doing, then absorbing AI or addressing AI becomes part of that continual process.”

Abby Sourwine is a staff writer for the Center for Digital Education. She has a bachelor's degree in journalism from the University of Oregon and worked in local news before joining the e.Republic team. She is currently located in San Diego, California.