
AI Nudges Syracuse Professors Back Toward Blue Books, In-Class Work

To prevent students from relying on artificial intelligence to write and do homework for them, many professors are returning to pre-technology assessments and having students finish essays in class.

(TNS) — Oksana Korol remembers her parents telling her how they took oral exams in college. Professors would give them a problem to solve. They had 10 minutes to think it through, then explain their answer and reasoning.

“That is completely AI-proof,” said Korol, a biology professor at Onondaga Community College.

Korol, like countless other professors in Central New York, has had to change the way she teaches her class and assesses students’ learning due to the ballooning use of generative artificial intelligence. Many, like her, are relying on those old-school techniques to AI-proof their tests and papers.

As AI continues to push itself into our everyday lives, it has created a culture of do or die for higher education. Professors have to either adapt their classes or risk students using AI to avoid thinking through tough assignments. Universities can offer courses teaching students to think critically about AI or assume young minds already know how to ask tough questions.

If higher education as a whole does not adapt, it could be left behind as students embrace AI in their studies.

At the three largest educational institutions in Syracuse — Syracuse University, Le Moyne College and Onondaga Community College — many professors are returning to pre-technology assessments while students use AI to study in futuristic ways.

OCC and SU student handbooks outline how using AI to complete entire assignments and papers is considered plagiarism or academic misconduct. Le Moyne College does not have a college-wide AI policy, although Provost Jim Hannan said the institution is starting the process to create guidance.

Without uniform guidance from school leadership, it’s often up to professors to figure out how to handle AI. Some professors spend free time teaching themselves how to wield AI in a productive manner. Others side-eye the technology.

All of them, however, must understand how it impacts their students’ learning.

“We have to learn how to use (AI) first. That arms race is exhausting and overwhelming when our workloads, admittedly, are already pretty heavy,” said OCC professor Michelle Malinovsky.

IN-CLASS ASSESSMENTS MAKE A RETURN


When a student uses an AI prompt generator to write a research paper, there are signs the student didn’t do it themselves. Malinovsky said older AI versions used the same rhetoric model. Le Moyne English professor Ann Ryan said a student’s paper may be too clean.

There is no sure way to prove a student used AI unless the student accidentally leaves the prompt in the paper or admits to it, said Jeff Rubin, Syracuse University’s chief digital officer.

While AI detection software exists, most of it doesn’t work well, Rubin and Malinovsky said. For that reason, SU and OCC do not recommend using the programs.

“It’s incredibly inaccurate. It’s always reactionary and it causes a huge issue in terms of equity and bias,” Malinovsky said.

SU, Le Moyne and OCC either encourage or require professors to decide how they want students to use AI. They each have three similar pre-made policies professors can put into their syllabuses.

The policies state that AI cannot be used under any circumstance, students can use limited AI or AI can be used freely within reason.

“It’s a first step of allowing faculty to think about the use and letting students know how an individual faculty member plans to allow or not allow on assessment in the classroom,” said SU Chief Digital Officer Jeff Rubin.

With the traditional research paper becoming harder to assign, many professors are flipping the classroom. For example, Ryan has students writing in class and brainstorming with classmates.

“I’m workshopping papers in class as students are writing them. Traditionally in a literature class you’d say, ‘You got a 12-page paper due in three weeks,’” Ryan said. “So what we’re doing is we’re writing papers in class.”

Malinovsky, who teaches English and is also a librarian, said she has implemented similar practices. She encourages students to send pictures of handwritten assignments.

“(AI) challenges professors to rethink the assessment. Perhaps if they’re worried about AI use, then bring some of that assessment back into the classroom,” Rubin said.

Professors have also started to make work more reflective. For Ryan’s class, students had to add a one-to-two-page coda to their final paper on how what they learned in class impacted them. They then shared that reflection with the class.

Learning assessments appear to be returning to traditional, in-person techniques. Korol said she has done away with online exams and implemented more in-class tests.

“I think that we’re back to the blue books and we’re maybe going back to the oral exams,” Korol said.

Even with the potential for misuse, AI can be a powerful tool if harnessed correctly. While assessments embrace older techniques, study strategies are flying forward.

Multiple professors said they see students upload class slides or paste class notes into a chatbot and ask it to create study tools like a practice quiz or study questions. Anthropic, the creator of the AI bot Claude, analyzed student chats and found they also use it to troubleshoot coding problems or synthesize complex information.

Korol has gone a step further and created a prototype AI tutor, to which she uploads questions for students to answer. The AI tutor knows not just the correct answer but the important ideas behind it, meaning students can express their answers in different ways.

“The tutor has been specifically told not to give out the answer, and instead it’s been told to keep asking leading questions. And if you read through this conversation, you will see that the hints are getting more and more transparent as we go along,” Korol said.

Korol has not yet used the tutor with students but is trying to figure out how to deploy the app for campus-wide use.
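The tutor Korol describes follows a common pattern for chat-based AI apps: a system prompt instructs the model to withhold the answer and respond with leading questions that grow more transparent with each attempt. The sketch below is purely illustrative, not Korol’s actual implementation; the prompt wording and the `build_tutor_messages` helper are assumptions, showing how such a conversation could be assembled for a chat-completion-style API.

```python
# Hypothetical sketch of a "Socratic tutor" prompt and message builder.
# Assumption: the prompt text and function names are illustrative only.

SOCRATIC_TUTOR_PROMPT = (
    "You are a biology tutor. You know the correct answer and the key "
    "ideas behind it, but you must NEVER state the answer directly. "
    "Reply only with a leading question, and with each new student "
    "attempt, make your hint slightly more transparent than the last."
)

def build_tutor_messages(question: str, attempts: list[str]) -> list[dict]:
    """Assemble a chat history for a chat-completion-style API call."""
    messages = [
        {"role": "system", "content": SOCRATIC_TUTOR_PROMPT},
        {"role": "user", "content": question},
    ]
    for i, attempt in enumerate(attempts, start=1):
        # Number each attempt so the model can escalate hint transparency.
        messages.append({"role": "user", "content": f"Attempt {i}: {attempt}"})
    return messages
```

Keeping the no-answers rule in the system message, rather than in each user turn, is what lets the hints escalate while the prohibition stays fixed across the whole conversation.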

TEACHING ABOUT AI


SU is taking AI education one step further: it has begun offering classes and a minor on the impact of AI. Beginning in fall 2026, the Maxwell School of Citizenship and Public Affairs will offer the AI Policy minor. Classes include programming and public policy analysis.

The idea is not to teach students how to create or use AI, but rather to think critically about its real-world applications, said program director Johannes Himmelreich.

“AI is going to have significant social impact. This will create policy challenges that require an understanding of the technology as well as the policy analysis skills,” Himmelreich said.

Malinovsky described generative AI as the “wild west.” There are no specific federal regulations on AI, with regulators instead relying on a patchwork of existing laws. On Dec. 11, President Donald Trump issued an executive order aimed at blocking states from creating their own regulations.

Himmelreich wants students to be able to tackle topics such as how to grow AI without destroying the environment, how AI could impact warfare or if AI could be conscious enough to deserve rights.

The idea of a minor exploring AI has been under discussion since 2020, a couple of years before the public explosion of generative AI.

“ChatGPT wasn’t a technological surprise at all. I mean, I’ve been in demos that OpenAI gave for the older models that weren’t a chat interface. And the difference is that OpenAI changed its policy around how accessible they want the technology to be,” Himmelreich said.

Thinking critically about AI is a core part of Milton Santiago’s Generative AI Filmmaking class. He launched the three-credit course this past fall, teaching students how to use generative AI in filmmaking while also discussing its impact.

“We can intercept a lot of the ethical and moral questions that surround these tools currently, which actually I think is probably the biggest benefit of the class,” Santiago said. “Students do have to wrestle with questions of where does creativity begin and end? What does authorship actually look like? Is the environmental impact as this technology stands today worth using the tools?”

Teaching about AI is not only happening in classes and studies focused on the topic. Malinovsky said she talks about AI a lot in her English classes. To help students better understand why they need to do their own work, she sometimes has them complete an assignment themselves, then ask an AI chatbot to do the same work. The students then compare and discuss the results.

As generative AI continues to develop, faculty are learning how to use it alongside students. Some, like Ryan, are skeptical of its application and try to avoid it, while others, like Santiago, spend hours figuring out the best applications for students.

“This has shaken up what education does in a way that other technologies haven’t,” Hannan said.

©2025 Advance Local Media LLC. Distributed by Tribune Content Agency, LLC.