Unlike ChatGPT, the custom-built CSU-GPT and its forthcoming companion RamGPT were designed on a secure network, CSU administrators said.
That means information submitted by students, staff and faculty will not be used to train Microsoft’s broader large language model, allowing sensitive research and student information to be entered into CSU’s chatbots safely, said Brandon Bernier, vice president for information technology and chief information officer.
“We get this secure, private fence that goes around all of our data so whatever happens here stays here,” Bernier said in an interview. “The models aren’t getting trained on it. All of our institutional data is then protected, and that’s an important thing for our researchers, students and faculty.”
Users can ask generative AI chatbots questions or give them prompts, and the programs draw information gathered from the Internet or programmed into their network to create text responses, images and sounds to fulfill the queries. (The Denver Post and other newspapers have sued ChatGPT maker OpenAI and Microsoft, alleging the companies illegally harvested copyrighted articles to create their generative AI systems.)
CSU’s decision to invest in widespread AI adoption has elicited a range of reactions from the campus community. Bernier acknowledged that “there are folks who want us to use it for everything and then there are the folks who don’t want it to be used for anything.”
University leaders emphasized the need to prepare students for the future workforce and the potential for staff and faculty to use the technology to ease administrative burdens and offload academic drudgery. Some faculty members expressed enthusiasm over the institution’s commitment to the budding technology, while others were wary about encouraging plagiarism and skirting critical thinking, as well as AI’s environmental toll.
“I am really, really disappointed in the whole thing,” said Hannah Parcells, a CSU senior double-majoring in political science and psychology. “I just think encouraging the use of generative AI, which inherently allows students to think less, is really antithetical to an educational institution.”
CSU leaders are encouraging the campus to take a balanced approach when sizing up the new initiative. RamGPT will be tailored for students when it launches this spring, while CSU-GPT, which debuted in October, was designed for the campus community at large, they said.
The university paid Microsoft $120,000 for the AI initiative in 2025 and will spend $142,000 on it this year and $142,000 next year, according to university spokesperson Tiana Kennedy.
A representative from Microsoft declined to be interviewed for this story.
“It feels like we have been on this journey from catch-up to capability,” Bernier said. “AI technology moved quickly, and it’s been a much more rapid evolution than some other technology cycles … with the help of Microsoft and a number of other folks, we’re in a spot where we’re leading … We created this strategic partnership with Microsoft, and we are one of few institutions who have this.”
MIXED FEELINGS
According to the university’s website, users can upload files like class schedules or academic calendars into CSU-GPT, ask it questions to receive “current” and “factual” information, and use the program to “generate ideas, write content or analyze data.”
David Edwards, CSU’s director of web services, is reaching out to university employees about what administrative tasks feel most tedious and how he can better automate them using CSU-GPT. The programs are being fed information about university-specific resources and protocols so a student could query for mental health help or yoga classes offered on campus and find everything they need in one spot, he said.
The technology could also be used like an online career center, Bernier said. The generative AI could assist in making resumes, writing cover letters, creating LinkedIn profiles or preparing for job interviews, he said.
“We’re not wanting to take anybody’s work and replace it with AI,” Edwards said. “This is something that can help you do your job. We want to make sure there is a human in the loop. This is not automation. This is not taking people out of any of these processes.”
Bruce Draper, a CSU computer science professor, said it’s a positive thing that his university is investing in this technology.
“The world is changing so quickly that I would be uncomfortable if we weren’t doing something like this,” Draper said. “Not that there aren’t going to be issues. It’s not going to be perfect on the first rollout.”
The majority of other higher education institutions are proceeding more cautiously, according to 2025 research from the Chronicle of Higher Education.
The publication surveyed 93 technology leaders at two- and four-year colleges in the United States. The survey found 13 percent of respondents said their institution was moving “full speed ahead” in their generative AI approach, 53 percent said they were moving gradually and a third said they were proceeding “slow and cautious.”
In October, CSU held an event on campus launching the Microsoft partnership, at which President Amy Parsons said the AI landscape is changing quickly. This was a reason to lead in the space instead of hesitating, she said.
“AI has potential to transform just about everything we do, from unlocking new ways to solve global challenges to supporting our daily workflows,” Parsons said at the event. “Rams deserve a world-class education and the very best tools to prepare them for careers in industries where AI is a differentiator.”
Members of the computer science department discussed how best to test and introduce the technology. They decided to start small, Draper said, by testing it out on a few classes.
Draper and his colleagues are uploading course materials — textbooks, readings, PowerPoint slides, lecture notes — into the program so that a student with a burning question in the middle of the night could type it into CSU-GPT and find an accurate response, he said.
“I can’t tell you how often I get an emailed question I read in the morning, but it was sent at 2:30 a.m. and, of course, I didn’t answer it immediately,” Draper said. “The idea of having an agent the student can ask that question to and get an answer — most of the time, that is all the information they need, but it’s there when they need it.”
To prevent students from asking CSU-GPT things like “what’s the answer to question three?” Draper said he’ll build in guardrails, like prohibiting the use of certain prompts, to deter cheating and plagiarism the best he can.
The computer science department is holding off on inputting student data like grades and personal information during the early phases of the rollout, Draper said.
“We’ll see how that goes down the line, but we’re very, very cautious where it involves anything with student data,” Draper said.
Meanwhile, in the writing and composition department, the forecast for the Microsoft partnership is a bit less sunny.
Genesea Carter, professor and associate director of the composition program, said she’s mostly heard worry and skepticism about the collaboration.
“Faculty and graduate students who I’ve worked with generally feel worried about the environmental impacts that generative AI is going to have and the intellectual property of generative AI, so I think they feel wary about this partnership,” Carter said.
Her students, on the other hand, are interested in learning how to use AI ethically, she said.
“I’m trying to figure out how to balance the ethical concerns and environmental concerns with the fact that these are tools students are expected to know how to use,” Carter said.
Parcells, the CSU senior, is so opposed to using generative AI that she said if it’s programmed into resources like Canvas, the online learning management system the university uses, she will devote time to figuring out how to disable it.
“I will be avoiding it like the plague,” she said.
Parcells is managing editor of CSU’s student newspaper, The Collegian. Increasingly, student editors are receiving articles they believe were written by AI, she said.
The senior is also a teaching assistant in an English course, tasked with flagging work that reads like it’s been written by artificial intelligence — a problem so rampant, she said, that she’s had to comfort a professor who wondered why they were even in the profession of teaching young people to think and write when students were turning to AI to do it for them.
“I find it concerning to be encouraging the use of something that, at best, we don’t know what it does to the human brain and, at worst, is damaging to building those foundational skills necessary for more in life,” Parcells said.
Carter’s history professor husband used CSU-GPT to create a spreadsheet for his classes based on the academic calendar. But she said she doesn’t have plans to use the program at this time.
“I have not figured out how to prompt engineer it efficiently enough that it is faster than what I can do myself,” she said.
During her tinkering with CSU-GPT, Carter asked it to create an outline based on an assignment she taught in class.
“It was OK,” Carter said, describing it as “C”-level work. “It wasn’t great.”
‘A MORE BALANCED APPROACH’
Criticisms of generative AI — its environmental costs, its cybersecurity risks and the offloading of critical thinking and writing — can conflict with the conventional core tenets of higher education.
Companies are building new data centers to accommodate an increase in AI usage, and those facilities drive up water consumption, energy usage and carbon emissions, according to the Environmental and Energy Study Institute.
“What is Microsoft doing to offset the environmental impacts of generative AI?” Carter said.
When asked what the partnership’s benefit to Microsoft was, Bernier said CSU is allowing the company to conduct pilots and proofs of concept with an institution investing in AI.
Edwards, the campus director of web services, acknowledged that there are people in the CSU community who are anti-AI.
“They don’t need to use it,” he said. “What they can see is the use of AI at CSU is helping the student experience regardless of whether or not they’re using it. It’s helping the faculty and staff provide our services — teaching and learning — more efficiently.”
Ultimately, leaning into AI and experimenting with it is a good way to learn more about the burgeoning technology, Bernier said.
“We’re really early in this technology and a lot of what people see or hear about it may not be factual yet,” he said. “All of the great things AI can do might not be known yet, and all of the bad things, too. Let’s just take a more balanced approach as we go.”
©2026 MediaNews Group, Inc. Distributed by Tribune Content Agency, LLC.