Making the case for local LLMs at the California IT in Education Conference this week, IT Director Domingo Flores of Merced County Office of Education said different-sized districts will weigh the pros and cons differently. In the pros column: greater data security and privacy compared to using ChatGPT, customization to district needs, cost efficiency and resilience.
“It’s not going to be as cheap as a ChatGPT or Copilot subscription, depending on how big your team is, but the upfront hardware costs, we have found in our initial analysis, are more cost-effective than paying for a large subscription for many users in our organization,” Flores said.
Potential drawbacks of local LLMs, he said, include upfront hardware costs, the maintenance and expertise required, the limitations of smaller models, and challenges with scaling. Flores acknowledged that chatbots like ChatGPT or Claude will be more advanced than models run in-house, but said that for some districts the security gains are worth that trade-off.
“I’m OK being a step behind, with how quick AI is being run right now,” he said. “We still get 90 percent of the benefits while keeping 100 percent of that data security with ourselves, so I’m OK with that.”
Flores outlined use cases for local LLMs in several departments. For the business office, where internal documents can be analyzed without leaving district systems, a local LLM could help with budgeting, reviewing contracts, and ensuring accuracy and compliance. For HR, it could draft job descriptions and generate interview questions. For IT, it could use retrieval-augmented generation (RAG) to support troubleshooting or update network topology maps without sending data to an external cloud service. For the classroom, it could power multilingual tutors for personalized learning, translation or academic support.
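The IT RAG pattern Flores describes boils down to two steps: retrieve the most relevant internal documents for a query, then hand them to a locally hosted model as context, so nothing leaves the district's own hardware. The sketch below is a minimal illustration of that loop, not district code; the word-overlap scoring and the sample knowledge-base entries are placeholders (a production pipeline would use embeddings and a vector store).

```python
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank internal documents by simple word overlap with the query.
    (A real RAG pipeline would rank by embedding similarity instead.)"""
    q_words = set(query.lower().split())
    return sorted(docs, key=lambda d: len(q_words & set(d.lower().split())),
                  reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble a prompt that grounds the local model in retrieved context."""
    context = "\n---\n".join(retrieve(query, docs))
    return f"Answer using only this internal context:\n{context}\n\nQuestion: {query}"

# Hypothetical internal knowledge-base entries, for illustration only
kb = [
    "VLAN 20 carries student Chromebook traffic at the north campus.",
    "Payroll runs on the 15th; contracts are reviewed by the business office.",
    "The core switch uplinks to the district firewall on port 48.",
]
prompt = build_prompt("Which VLAN do student Chromebooks use?", kb)
```

The assembled prompt would then go to a model served on local hardware rather than to an external cloud API, which is the security property Flores emphasizes.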
To stand up their own local LLMs, Flores said, districts should start by building proofs of concept for their intended applications, whether for the business office, HR, an IT RAG assistant or a bilingual tutor. He recommended using open-source tools like Ollama, a user-friendly tool for downloading, installing and managing LLMs on a local computer, in tandem with Open WebUI, a self-hosted web-based front end for interacting with those models.
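To give a sense of what the Ollama half of that stack looks like in practice: once a model has been pulled, Ollama serves it through a local HTTP API on port 11434, which front ends like Open WebUI talk to. The sketch below calls that endpoint directly from Python; the model name is a placeholder and this is an illustration of the local loop, not the presenter's actual setup.

```python
import json
import urllib.request

# Ollama's default local endpoint; requests never leave the machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generation request for a locally served model."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask_local_model(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its answer."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running Ollama daemon with the model already pulled):
# print(ask_local_model("llama3", "Draft interview questions for a help desk role."))
```

Open WebUI layers a chat interface, user accounts and document uploads on top of this same local API, which is why the two tools are commonly paired for proofs of concept.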
Once the proofs of concept are built, though, Flores said schools should contract with companies like Cisco and NVIDIA for enterprise support with scaling and implementation.
“Personally, I’m not willing to put something in place to be used on a large scale unless we have enterprise support … If I have a problem with a particular model, or being able to access it, instead of jumping on Reddit or hitting some Discord software on Open WebUI, I can open up a TAC [Technical Assistance Center] case. I can call Cisco with our existing rep right now and get that enterprise-level support that we need,” he said. “I’m a big crawl-walk-run guy, and so this is the crawl stage for us. We are going to be walking with those proof of concepts, and we’ll get off and running once we have that partnership with Cisco in order to get that enterprise support.”