Virginia Colleges Take Varied Approach to AI Education

Amid warnings about the need for AI literacy and the possibility of job losses, professors at institutions like Randolph College, Liberty University and the University of Lynchburg are developing their own use policies.

(TNS) — Colleges in the Lynchburg area are teaching students the responsible use of artificial intelligence and how to use it as a tool, preparing them for their futures beyond the classroom.

Just this week, U.S. Sen. Mark Warner, D-Va., warned local university presidents of potential job dislocations that may emerge among college graduates due to AI.

Artificial intelligence, or AI, refers to computer systems that perform complex tasks normally done by humans, such as reasoning and creating. Generative AI systems, such as ChatGPT, can create text, images and other content when prompted by a user, according to the U.S. Government Accountability Office.

ChatGPT was released in late November 2022 by OpenAI as a generative artificial intelligence (GAI) chatbot. It quickly gained popularity and was temporarily banned in New York City public schools.

Since then, attitudes toward AI in education have shifted to incorporate the technology into student learning, including in Virginia’s colleges.

PROFESSOR DISCRETION

Most local institutions are leaving it up to faculty to develop their own classroom AI policy.

Randolph College doesn’t categorically forbid the use of GAI tools for student academic work, but does require that instructors establish their own rules on the use of GAI in their courses, according to school policy.

“Instructors have the authority to determine how or if they allow their students to use these tools in their classes,” the policy states.

RC Vice President for Institutional Effectiveness John Keener told The News & Advance the policy came out of an institution-wide task force he chaired during the 2024-25 academic year to study the possibilities, uses and effects of GAI on the classroom and workplace.

Central Virginia Community College’s Coordinator of Professional Development Michael Babcock said AI usage at CVCC is also decentralized, meaning every professor has different AI policies and tools in their class.

“We do not have a single, uniform college approach,” Babcock said.

Babcock, who is also an English professor, said he takes an open approach with his own students, discussing AI and modeling the technology in the classroom. He helps students set up ChatGPT accounts and teaches them appropriate and inappropriate ways to use it in an academic setting.

“I talk with my students a lot about how I use it in my own personal, as well as my own professional, life,” Babcock said. “And I think that that’s a really important way to humanize the technology.”

At the University of Lynchburg, Chief Educational Technology and AI Officer Charley Butcher said he knows AI won’t be incorporated into every single course because some faculty will hold out.

“You’ve got to model lifelong learning if you want your students to be lifelong learners,” he said during a webinar in August where UL leaders discussed how the university was embracing AI.

AI DETECTION TOOLS

At Liberty University, students are prohibited from submitting AI-generated content, but the university does allow ethical AI assistance, said Alexander Mason, associate dean of LU’s College of Arts & Sciences.

He said the university’s policy prohibits using AI to generate wholesale content for assignments or any other AI assistance that “compromises originality or authenticity.”

The policy permits basic spelling or grammar correction, inspiration, brainstorming support and any ethical assistance that doesn’t author content.

“So, students can use it for brainstorming, editing, feedback, but not for generating their assignments entirely,” Mason said.

He said LU also employs sophisticated detection and prevention tools, including Turnitin, that analyze a student’s writing and score it with a percent probability that it is AI-generated.

He said Turnitin flags traditional forms of plagiarism and potential misuse of generative AI tools.

“Liberty is encouraging students to use it in a number of creative ways without being unethical, but we’re also trying to educate students on why their integrity matters in this,” he said.

Butcher said UL has eliminated “I gotcha” tools such as Turnitin and GPTZero because they are a disservice to students.

“That’s the approach that we’re going to take instead of saying, ‘No, you can’t do that,’” he said.

Babcock said he thinks AI detectors are virtually worthless despite widespread interest in them a few years ago.

“We don’t have any confidence in those anymore,” Babcock said. “Nobody does, and so you have to rely upon other methods to enforce an attitude of academic integrity.”

Babcock said one solution he has explored is “really old-school” class assignments, such as handwritten assessments, to measure a student’s baseline abilities.

“So, you know what the student’s non-AI writing ability is,” he said. “We can use that as a baseline against which you can compare later submissions.”

He said oral exams are becoming more common in college.

“This is going to be the future,” he said. “We’re going to do more of this, and just basically getting into a room where there is no AI, and just talking with the student face-to-face.”

AI AS A TOOL

Higher education institutions in the area may take different approaches to student AI usage policies, but they agree students should use AI only as a tool.

RC’s Keener said GAI shouldn’t be used as a substitute for creative and critical thinking.

Rather than replacing a project’s overall work, he said, GAI should make the individual steps of a project more efficient.

“It should, generally, all things being equal, be approached as a way of being more efficient in the steps inside a task,” he said. “It shouldn’t be a shortcut for the overall task itself.”

Liberty University is trying to draw the same line for students on AI’s role, Mason said.

“AI can be used as a support tool, but it’s never a substitute for actual human work,” Mason said.

Butcher said Gemini, Google’s AI assistant and UL’s adopted AI platform, can act as a tutor that won’t give students the answer but will help them find the next step.

“You have those conversations with Gemini, and it changes the entire equation,” Butcher said.

Butcher said critical thinking is a key to AI usage.

“Being able to really, intentionally think about the prompts that you’re putting in and then vetting whatever comes out of the AI tool, regardless of the tool that you use,” he said.

Babcock said he tells his students to use it as a way to supercharge their own creativity or thinking.

“It becomes a way to augment your thinking and to sort of act as a backboard for your own thinking,” he said. “And that’s really what I try to model because I know that that is what’s transferable for my students outside the classroom.”

ENVIRONMENTAL CONCERNS

Many experts have voiced concerns about the environmental impact of AI, which is resource-intensive and can require a high amount of electricity and water.

Lisa Powell, Sweet Briar College vice president of academic affairs, dean of the college and chief sustainability officer, said SBC is working to build understanding of AI and its implications on campus, especially as it relates to SBC’s focus on environmental sustainability.

Butcher said some UL students are concerned about AI’s environmental impact.

He said that in the fall of 2026, UL is implementing a program called Elevate, in which every student will receive a MacBook Air with the M4 chip as part of their technology fee.

He said the chip has AI processing built in, meaning simpler requests won’t have to travel to data centers.

“We hope that’s a step in the right direction,” Butcher said.

Babcock also acknowledged the environmental footprint associated with AI usage.

“It’s one of the things that I introduce my students to as well,” he said. “Namely, that as we continue to grow into this AI age that we’re in, we’re going to have to adapt to the very real infrastructure and environmental impact of that, and that’s very much a work in progress.”

PREPARING STUDENTS

Mason said LU’s AI policy is helping students build integrity and adaptability.

“They’re learning to use AI responsibly while also making sure that we don’t neglect the mastery of skills that’s necessary to exist in the world,” he said. “So, it ensures that our graduates are going to be able to thrive in a workplace that uses AI without losing their originality or their Christian ethical grounding in that.”

UL Senior Director of Academic Initiatives Sandra Perez said UL is trying to prepare its students for their futures and careers.

“There are very few careers out there right now that they’re not using AI,” she said. “So, let’s teach them how to do it and to do it well so that they are the leaders in the industries that they’ve chosen.”

© 2025 The News & Advance, Lynchburg, Va. Distributed by Tribune Content Agency, LLC.