
How Is UW-Madison Navigating AI Additions to Software Tools?

Spokespeople from the University of Wisconsin–Madison say new AI features added to some tools students and staff are using may require additional “evaluation on multiple levels from the university.”

(TNS) — A software program UW-Madison faculty and students use on a daily basis has added artificial intelligence tools to assist with grading and summarizing discussion posts.

But the university says some of the tools could run afoul of guidance it provides instructors against using AI to automate student feedback.

Canvas, which UW-Madison and many other universities across the country use to manage coursework, has embedded the new AI tools directly in the program. Faculty and administrators will control the integration of the AI tools and can disable features in individual courses or assignments.

Educators will now be able to create quizzes to test students and grade assignments with a feature that takes a first pass at the scoring rubric and provides feedback — all using AI.

There’s also a new “ChatGPT-like” assignment educators can create by setting prompts and evaluating students’ responses while conversing with the AI tool.

The AI tools were announced last week as part of Canvas parent company Instructure’s new AI software, IgniteAI, and a partnership with ChatGPT’s maker, OpenAI.

The AI features, which educators can choose to use or not, likely require additional “evaluation on multiple levels from the university,” UW-Madison spokesperson John Lucas said in a statement.

“Overall, per the campus AI guidelines, instructors are advised not to use AI to automate student feedback,” Lucas said. “We continue to encourage instructors to be transparent with students about how they are using AI in instructional settings.”

UW-Madison’s AI guidance says instructors can’t include any identifying information of a student if they submit their work into AI tools to assist with feedback.

The university doesn’t have any immediate plans to launch the tools, Lucas said.

In recent years, UW-Madison has expanded its AI investments through research, including its Wisconsin Rise Initiative, and created courses for students to learn more about the emerging technology.

At UW-Madison, a professor can authorize the use of AI tools for a course-specific assignment, but under state policy, students are subject to disciplinary action under the university’s academic misconduct policy for using “unauthorized materials or fabricated data in any academic exercise.”

CAUTION ADVISED


Dietram Scheufele, UW-Madison professor in the Life Sciences Communication Department, teaches a course called Science, Media and Society that studies AI. He said he could see faculty and staff at universities being tempted by the convenience of grading assistance and a quiz question generator, but the technology is still not fully developed.

AI often “hallucinates,” generating made-up information based on the training data it has been fed, Scheufele said. One prominent example of AI hallucination has come in law, where attorneys have used AI to draft legal documents and it has produced false information, he said.

“They’re going to live in a world where that’s not the new normal, but the normal,” Scheufele said. “But the moment we’re going to start as teachers to try and outperform them by using AI on our end, we’re going to have this arms race of who uses AI better, and we’re going to end up with the lowest common denominator of knowledge rather than this really rich experience that college should and can be.”

Instructure says the information the AI collects about Canvas users through the new tools won’t be shared with OpenAI.

Scheufele said many college students are already known to use AI in their coursework, even when it’s unauthorized at their university, and familiarity with the technology can help with real-world tasks that employers expect graduates to understand.

“What we want them to be able to do is use AI for all the things where it makes their life easier and accelerates progress and will make for better societal outcomes while keeping all the abilities that will be so fundamentally necessary,” Scheufele said.

Those include thinking creatively across different fields, something “AI simply can’t do — can’t do yet and probably never will be able to” at the level humans can, he said.

USE WIDESPREAD


A 2024 global survey of nearly 40,000 college and graduate students by the Digital Education Council found that 86 percent say they use AI in their studies.

David Williamson Shaffer, a UW-Madison professor of Learning Analytics and Learning Sciences, said an AI grading aid isn’t far from a teacher’s assistant comparing work to a grading rubric a professor provides for a large lecture class, but the AI tools have been known to introduce bias.

“The idea that the professor isn’t grading everything isn’t entirely new,” Shaffer said.

Shaffer is teaching a new course this fall that requires students to use AI tools in every assignment in a thoughtful way.

“Rather than leaving AI completely on the outside and saying, ‘Well, we just shouldn’t use it,’ which isn’t going to work, or ‘Use it but we’re not going to take it seriously,’ which isn’t going to work,” Shaffer said.

“The most important thing from an educational perspective is to be able to understand and incorporate it into what we’re doing. So, I think from a social standpoint, [the goal] is to put some legislative guardrails around what AI companies can and can’t do.”

©2025 The Wisconsin State Journal (Madison, Wis.). Distributed by Tribune Content Agency, LLC.