Anderson University's Cybersecurity Courses Evolve With AI

The Center for Security Studies and Cyberdefense at a private Christian university in Indiana is training students to identify potential misuses of artificial intelligence in a variety of cybersecurity environments.

(TNS) — In David Dungan's view, the programming and curriculum at Anderson University's Center for Security Studies and Cyberdefense aim, in one way or another, to answer a rather uncomplicated question.

"Bad guys want your stuff, so how do we keep them from getting it?" said Dungan, the center's executive director.

The advent and rapid development of artificial intelligence chatbots, including ChatGPT, Google's new Gemini app, HuggingChat and others, have experts across wide swaths of industry, commerce and government scrambling to keep up with the technology's potential ramifications.

Dungan believes the reality is no different in academia. He said universities like AU face daunting challenges in training future employees to identify potential misuses of AI in a variety of cybersecurity environments.

"The scariest thing is that security always lags behind the offender and the concept," he said. "The speed is what is scary — how quickly they can adapt their tools, techniques and protocols to exploit others. But the other side of that is, we have the ability to do the same thing on our side."

Leaders at AU and elsewhere are recognizing that AI's far-reaching effects will continue to shape research and policymaking in government, health care and other areas.

"If you're online, you've been affected," AU President John Pistole said. "Our program, as it continues to mature, will inform others of the benefits of AI, as well as the risks of using it, such as a student relying on ChatGPT exclusively to generate a term paper. It's a resource, not a panacea."

In the classroom, the technology's rapid evolution has challenged educators to reevaluate their approaches to everything from curriculum choices to the types of resources they deem permissible for student use.

"Really the last five years or so, we've seen huge progress on these (AI) models," said Joe Craton, an assistant professor of computer science at AU. "That is forcing us to rethink how we write code, for example."

Craton said programs exist that allow a user to begin writing a line of code before the program auto-completes it, much like the autofill function on text messaging apps.
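As a rough illustration of the idea Craton describes (not actual course material, and far simpler than the AI models such tools really use), inline completion can be mimicked with a basic prefix lookup:

```python
# Toy sketch of prefix-based code completion, loosely analogous to the
# autofill behavior described above. Real AI coding assistants use large
# language models, not a static snippet table like this one.

SNIPPETS = [
    "for item in items:",
    "for i in range(n):",
    "def main():",
    "import os",
]

def suggest(prefix: str) -> list[str]:
    """Return stored snippets that complete what the user has typed."""
    return [s for s in SNIPPETS if s.startswith(prefix)]

# Typing "for " would surface both loop snippets as candidate completions.
print(suggest("for "))
```

The educational tension Craton raises is visible even here: a learner who accepts the suggestion never practices writing the loop themselves.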

"We've had to think about, to what extent do we want new learners using those tools versus learning to do things themselves?" he said. "It's really been a long process thinking through how we teach and think about these topics."

Dungan and his staff are working to create a statewide network of smaller colleges that will collaborate on creating more customized learning tracks based on feedback from businesses in several sectors. AI's rapidly expanding footprint, he noted, means an emphasis on "producing graduates that are meeting those needs as closely as possible to the realistic timeframe.

"Traditionally, academia has told the business world, hey, here's what we're producing, here's what you're going to get," Dungan added. "What we're doing here is flipping the script on that and saying, business, what do you need?"

©2024 The Herald Bulletin (Anderson, Ind.). Distributed by Tribune Content Agency, LLC.