
Professors Cite Data Privacy, Cheating Among Top AI Concerns

A recent professional development course about generative AI suggests college professors still have reservations about data privacy, plagiarism, accessibility and mixed messages around the technology.

One year since the launch of ChatGPT, some initial fears about generative artificial intelligence (GenAI) tools in education have abated. Federal officials and nonprofits are publishing guidelines and best practices, new use cases are proliferating, and an August poll of about 600 higher-ed instructors by the ed-tech company Cengage Group found over 80 percent of them believe GenAI tools will play an increasingly important role in their institutions in the years to come. But if a recent professional development course from the ed-tech company Course Hero is any indication, many professors remain worried about data privacy, plagiarism, accessibility and mixed messages around the technology.

Course Hero Vice President of Academics Sean Michael Morris, who led a four-week “AI Academy” course for over 350 educators in October, encountered some of these concerns when leading sessions about using AI tools for assessment design and teaching AI literacy to students. He said aside from worries about academic dishonesty among students, some of the most common questions among educators looking to adopt GenAI tools for the classroom revolved around the data required to use them.

“What kind of data is required? What do you have to surrender in order to use the tool? Is it just your name and email address, or is it more information than that [for sign-up]? How is the AI being trained, and where is the data coming from that trains the AI? Can you detect implicit bias in the AI?” he said, listing frequently asked questions from educators about GenAI tools and data privacy.

Morris said another common concern was whether some AI tools are accessible for students with disabilities. While some programs have specially designed tools such as screen readers to address this, Morris said there are other hurdles.

“We want accessibility for disabled students, but also accessibility in terms of cost,” he said. “What kinds of costs are we paying to use the AI? … It immediately becomes this discussion about the digital divide — doubling down on the digital divide that already exists.”

One of Morris’ students in the AI Academy was Nicole Jones Young, a professor of organizational behavior at Franklin and Marshall College in Pennsylvania. She said one of the biggest concerns that remains when it comes to the use of AI programs in education is cheating among students, and how to combat and prevent it.

“Faculty are really concerned that the students are just going to plug in the question on the assignment and get whatever is on ChatGPT, copy and paste it and submit it. I think that’s the No. 1 biggest concern,” she said. “If it’s 10 p.m. and an assignment is due at midnight, there’s a good chance that they’re running out of time and might copy and paste one of the prompts and then give it back to me.”

Prior to the recent rise of GenAI tools, Young said she had used plagiarism-detection tools like Turnitin. However, because GenAI tools draw from such large data sets, AI-assisted plagiarism is now harder to detect; she suggested that educators instead teach students how to use GenAI as a helpful tool, rather than as a means to complete assignments.

“The way that I’ve used [AI] in the past was more so for detecting plagiarism,” she said. “I thought another approach would be for me to figure out how to integrate [GenAI] and help my students use it and explain how to use it better.”

Although more professors are warming up to using GenAI tools in class, Young said students may be getting mixed signals from instructors. She said she's concerned about the lack of uniformity in higher education policies around GenAI tools, as well as how apprehension among some instructors could impede broader adoption of the technology in the years to come.

“I am very concerned about my faculty colleagues who are more resistant [to AI in education] because when you step into my class, I’m saying, ‘OK, we’re going to use it, and here it is.’ … Students may go into their next class and the faculty member is like, ‘I don’t want to hear about this and I’m shutting it down,’” she said. “It’s very confusing, I think, for students. I do wish there would be some kind of uniformity from more institutions.”
Brandon Paykamian is a staff writer for Government Technology. He has a bachelor's degree in journalism from East Tennessee State University and years of experience as a multimedia reporter, mainly focusing on public education and higher ed.