Yale Student Suing Over Accusation of Improper AI Use

Yale University professors flagged unusually long, elaborate answers on an exam as possible cheating with an AI chatbot. Now the student alleges they discriminated against him, a French native residing in Texas.

(TNS) — A Yale University student is suing the university, alleging he was suspended from his School of Management program for a year for misleading the Honor Committee after being erroneously accused of using artificial intelligence on an exam.

The plaintiff, who filed the case as a John Doe, is a French native residing in Texas who was enrolled in the Master of Business Administration for Executives program, according to the 50-page lawsuit filed Feb. 3.

The lawsuit seeks a jury trial. It alleges that the university, along with three professors and two deans who either taught a course in which the plaintiff was enrolled or serve on the Honor Committee or its appeals board, discriminated against the plaintiff as a non-native English speaker, attempted to coerce a false confession and denied him due process rights.

As a result of an investigation, the plaintiff said, he was suspended from his program for a year and given a failing grade in his Sourcing and Managing Funds course. He alleges in the suit that the university and the named defendants violated the Civil Rights Act by discriminating against him on the basis of his national origin.

The lawsuit alleges that a professor said in a June 11, 2024, email to Honor Committee officials that a teaching assistant had flagged the plaintiff's exam as having answers that were "unusually long and elaborate in formatting in answering the questions" and used "near perfect punctuation and grammar." The exam was open book but closed Internet, according to the suit, and the instructor ran some of the plaintiff's answers through the detection tool GPTZero to help confirm the suspicions.

In the lawsuit, the plaintiff pointed out that one Yale department acknowledges that no artificial intelligence tool can detect artificial intelligence use with certainty.

Yale University did not respond to a request for comment this week.

Colleen Bielitz, associate vice president for strategic initiatives and outreach at Southern Connecticut State University, said artificial intelligence presents both a challenge and an opportunity for educators. While most educators accept as a basic reality that they are preparing students for a future in which generative language models and other emerging forms of artificial intelligence are integrated into the workplace, the technology has also presented new challenges for integrity and rigor in academia.

"I don't think anybody really has a handle on even how widely it's being used, or if it's not being used, what those percentages are," she said. "As artificial intelligence tools grow more sophisticated, I feel like artificial intelligence-enabled cheating has probably become more sophisticated as well."

Bielitz did not comment on the specifics of the Yale University lawsuit, but acknowledged that many of the issues it raises reflect themes educators are currently wrestling with in the classroom. Bielitz said she has, on rare occasions, received work from students that she knew was copied directly from ChatGPT, because the students had copied and pasted a disclosure saying as much into their assignments.

She said that many educators can also discern whether the writing in a student's assignment matches their previous work. However, there is no way to be absolutely certain, she said.

Bielitz said one remedy for educators is to increase experiential learning opportunities, placing less emphasis on written responses in instruction. However, she said, written assignments remain critical in helping students process their learning.

"Everybody wants to get to the finish line quicker and we have to get to the heart of what inspires us to be educated. Learning, as with life itself, is sometimes difficult, but getting through those lessons is how we learn," she said. "That deeper thought is what lets it sink in."

Bielitz said that as the understanding of how artificial intelligence can be used appropriately and inappropriately in educational contexts evolves, there may be growing pains: professors unfamiliar with the common characteristics of generative language model output may fail to detect plagiarism and cheating, while some students may be falsely accused. She said she believes universities may need to catch up to artificial intelligence culturally.

"I think we need to make cheating the exception and integrity the norm," she said.

She said she does not believe that can happen until it is better understood that using artificial intelligence without attribution constitutes plagiarism and academic dishonesty. She also believes, she said, that academia must make clearer that learning is a process without instant results.

"You have to take time and sometimes sit through things to learn about them and write about them and research them. There's no quick conclusions to education; we can't just download the education in your brain, you have to take the time to sit with what you've learned," she said. "I want students to understand getting through the hard stuff is how the learning occurs."

©2025 the New Haven Register (New Haven, Conn.). Distributed by Tribune Content Agency, LLC.