
Colleges in Alabama Examine AI Usage in Classrooms

(TNS) — As generative artificial intelligence (AI) models progress and gain more attention online, Calhoun Community College and Athens State University are both working to update their AI policies for students.

"Calhoun Community College is still in the process of finalizing our policy on the use of artificial intelligence in the classroom. As AI technology continues to develop and evolve, our approach remains a work in progress," said Calhoun Vice President for Academic Affairs Wesley Rakestraw.

Athens State spokesperson Lauren Blacklidge said the university is also in the process of developing an AI policy.

Although most colleges identify some clear cases where AI use is restricted, the line between acceptable use and academic dishonesty is sometimes blurry.

The use of AI currently falls into one of three categories at Calhoun: restricted, limited and integrated. For example, Calhoun's code of conduct bans the use of AI for automated written materials. Some departments, however, encourage the use of AI, such as the Computer Information Systems (CIS) division.

"Our approach in CIS is probably a little different than some of the other departments in the school," said Jeremy Blevins, department chair of CIS at Calhoun. "They're more worried about students plagiarizing using AI. But looking at where our students are going to be getting jobs, we know that they're going to have to leverage AI as a part of the work."

With a growing cybersecurity field in north Alabama, CIS works to give students experience with relevant technologies while also teaching them how to use AI ethically.

"One of the things that schools need to do is understand what the industry needs so that we can prepare students for those specific roles," Blevins said, "because if we're just broadly teaching concepts, but they can't make direct applications when they get to the interview ... it hasn't benefited them."

He acknowledged that academia can move more slowly than industry, though. With technology rapidly evolving, CIS tries to strike a balance between educating students about technological trends and not changing the curriculum every time a new AI development occurs.

"The problem with jumping onto whatever is new and flashy too early is that you may spend a lot of effort building into something that's just going to flop in a year," Blevins said. — Practical application

One network security class at Calhoun uses AI to learn what bad code looks like when discussing code vulnerabilities. Many of the students in the class aren't yet well-versed in multiple programming languages. Using AI to generate code allows them to examine issues in areas they're not as familiar with, Blevins said.
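One classic example of the kind of flaw such an exercise might surface is a SQL injection vulnerability. The short Python sketch below (a hypothetical illustration, not an assignment from the course) shows a query built by pasting user input directly into the SQL string, alongside a parameterized version that avoids the problem.

```python
import sqlite3

# Hypothetical illustration of a common flaw students might be asked to spot.
def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Flawed: user input is pasted straight into the SQL string, allowing injection
    # (e.g., username = "x' OR '1'='1" would return every row).
    query = "SELECT id, email FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # Safer: a parameterized query keeps the input out of the SQL itself.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()
```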

Another assignment involves creating a cover letter and a resume for a job posting, then using AI to enhance them. Many companies now use applicant tracking systems, which run applications through AI-assisted filtering before suggesting candidates for review. Blevins said students use AI to assess their cover letters and resumes against the job description to understand how to work with these systems.

Blevins said students can also ethically leverage AI outside of the classroom. For example, students can ask a model to explain a concept they don't understand well or to quiz them while studying.

Like any use of AI, though, Blevins said students shouldn't automatically take each answer as truth. He referenced a concept known as GIGO: garbage in, garbage out.

AI models are trained using human-produced content. Because people have biases and make errors, AI can then repeat these mistakes.

AI can also hallucinate — or produce incorrect or illogical answers.

"If you're a learner, how do you know that you can trust the information that it's given you back?" Blevin said. "You check its sources. When I've gotten something from an AI that I was a little cautious about, I'll tell it to provide me its sources for its information. ... As a student, don't just assume the information that it's giving you is 100% accurate. It goes back to, I think, Ronald Reagan in the '80s had a quote: 'trust but verify.'"

Although CIS doesn't give out as many written assignments as other departments, Blevins and other instructors have come to recognize key giveaways that students are using AI when they shouldn't.

"One thing that I think a lot of people are noticing right now, especially on written assignments, is the grammar is too good, and the language is too precise," Blevins said. "It's in technical terms that, especially at a community college, students wouldn't normally speak that way."

Additionally, if a student turns in an assignment that would be above the skill level taught in the class, professors can identify it as likely AI-assisted.

To help instructors better identify AI-produced assignments and understand the complexity of ethical AI usage in the classroom, Calhoun instructors have participated in training at Auburn University, according to Rakestraw. A course from Auburn's Biggio Center includes hands-on learning opportunities related to AI best practices, according to the Alabama Community College System.

ACCS schools are developing AI policies tailored to the specific needs of their students and industries, said Abigail Carter, ACCS associate director of computer information technology instructional programs.

"Additional professional development sessions focused on AI integration are scheduled for this fall to further support our faculty in adapting to this rapidly changing technology," Rakestraw said.

Randy Sparkman, an independent AI consultant from Hartselle, advises that schools don't need to start from square one to create an effective AI policy.

"My advice to schools is not to develop a lot of new AI policies, but to use the ones you have already for computer use. That's the first thing," Sparkman said. "The second thing I tell them to do is to design their approach and govern their approach in community. Get a committee in the school building of people who are interested and let them collaborate on this to figure out what makes sense."

He also recommends people understand the technology and cultivate their own AI literacy.

© 2025 The Decatur Daily (Decatur, Ala.). Distributed by Tribune Content Agency, LLC.