Should Universities Build Their Own Custom AI Tools in House?

Some universities have developed their own on-premises generative AI tools for students and staff, which have the advantage of data privacy but may require considerable money and expertise to launch and maintain.

As officials and instructors at U.S. universities acclimate to using artificial intelligence tools for research and instruction, the growing popularity of generative AI has led some institutions to develop their own in-house AI programs to better manage the technology and make it more accessible.

Among those schools is the University of California at San Diego, which recently launched its in-house TritonGPT, described on the university’s website as an open-source large language model (LLM) that can answer questions about the university and assist with course content generation and editing, among other functions. According to UC San Diego CIO Vince Kellen, the platform gives the university more control over its data, which he said is one of the main benefits of developing on-premises AI tools like TritonGPT: none of the data used on the platform goes to any third party.

“It’s entirely on-prem in our San Diego Supercomputer Center here at UC San Diego, so the data is completely under our control, and it never leaves our network border to go externally,” he said. “On-prem [AI] sort of solves that issue. You don’t have to deal with a [vendor] contract and privacy issues. … On the downside, we have to manage our own capacity and manage our own hardware.”

Citing similar data-management benefits, the University of Michigan also recently developed its own suite of generative AI tools: U-M GPT, U-M Maizey and U-M GPT Toolkit. According to the university’s website, the most accessible of the three is U-M GPT, a free, university-hosted large language model for chat-based queries, while U-M Maizey can analyze data sets supplied by students and faculty and connect to Google, Dropbox and Canvas. The U-M GPT Toolkit, the university’s most advanced option, gives users complete control over AI environments via access to an API gateway, which could be particularly useful for researchers and faculty with deeper technical knowledge of how AI and LLMs work.
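The university’s website does not spell out the Toolkit’s interface, but API gateways in front of hosted LLMs commonly accept OpenAI-style chat-completions requests. As a purely hypothetical sketch — the gateway URL, model name and key below are illustrative assumptions, not U-M’s actual API — a researcher’s script might assemble such a request like this:

```python
import json
import urllib.request

# Hypothetical values -- NOT the real U-M GPT Toolkit endpoint or model names.
GATEWAY_URL = "https://example.edu/api-gateway/v1/chat/completions"
API_KEY = "your-institutional-api-key"

def build_chat_request(prompt: str, model: str = "campus-llm") -> urllib.request.Request:
    """Assemble an OpenAI-style chat-completions request aimed at a campus gateway."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

# Build (but do not send) a request; urllib.request.urlopen(req) would submit it.
req = build_chat_request("Summarize the key dates in the fall academic calendar.")
print(req.get_method(), req.full_url)
```

Because the gateway sits inside the institution’s network, prompts and data in such a setup never reach an outside vendor — the point Kellen and Jones emphasize about on-prem deployments.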

University of Michigan Executive Director of Emerging Technology and Support Services Bob Jones echoed Kellen, saying on-prem AI tools like theirs allow universities to better secure data and protect privacy, rather than leaving those concerns largely up to third parties like OpenAI, the developer of ChatGPT. Noting concerns that student use of tools like ChatGPT could widen the digital divide, he said developing U-M’s in-house AI also let the university make AI tools more accessible to students, both in cost and in how they function.

“We wanted to remove barriers [to using AI] for our community and beyond, and we focused on key capabilities that we thought were important,” he said. “Our data is private to the community. We do not pass any information along to the actual large language model for training or a change in the algorithm, which is sort of a key difference between us and OpenAI. Our environment is also accessible via screen readers, and we work with our accessibility team to ensure that it’s available to as many people as we can possibly make it.”

While there are significant benefits to developing in-house AI tools, Kellen said it may still be difficult for universities and colleges with more limited resources to create and deploy their own generative AI platforms for similar functions. In the case of UC San Diego, he said, the university already had some experience working with AI prior to last year’s launch of ChatGPT and the growing popularity of AI tools to assist with instruction, grading and generating course content.

“We have been hosting a machine-learning GPU-based cluster for students and faculty to use in classes for the last six years, so we’ve had a long history of running different AI models for the academic enterprise. … We just decided to take what we have been doing and extend it further,” he said. “It’s going to be harder for smaller institutions to [develop in-house AI] because of skill. We happen to have a lot of skill in our university between our student population, our faculty population and my own staff.”
Touching on the need for resources such as expertise, Jones said partnerships with tech companies could play a role in helping other colleges and universities to get similar projects off the ground.

“There was no road map for that,” he said. “We ended up partnering with Microsoft and we currently use their Azure environment to enable some of this, but we also found open-source tools and other solutions to enable the environments that we wanted to create.”

But as more people become acquainted with AI tools and how they work, Kellen said, it will become easier for universities to develop their own in house. He added that the costs of adopting and creating AI platforms are also likely to decrease in the coming years, making similar projects more feasible for institutions with limited resources.

“Many startups are [creating AI tools] with, like, two to three recent graduates,” he said. “The advice I have for other institutions is to not lock yourself in just yet and watch cost carefully.”

As the field of AI and machine learning continues to advance, Jones said, colleges and universities must be willing to experiment — learning what works and what doesn’t — to develop their own AI tools that meet their institutions’ needs.

“It’s about embracing the challenges of the unknowns, and if you’re willing to do that, you’ll figure something out,” he said. “I think that’s kind of what drove the development process, was the willingness to think big and fail.”
Brandon Paykamian is a staff writer for Government Technology. He has a bachelor's degree in journalism from East Tennessee State University and years of experience as a multimedia reporter, mainly focusing on public education and higher ed.