According to early adopters, the role is still being defined in higher education, taking cues from CAIO duties in industry and government. CAIOs in education align AI innovation with research, teaching, administration and public service while navigating ethical concerns, faculty apprehension and the often slow pace of academic governance.
BEYOND INFORMATION TECHNOLOGY
Chris Mattmann, UCLA’s inaugural chief data and AI officer, described the CAIO role as fundamentally different from the CIO position.
“AI will transform every part of our lives. It’s not just information technology. It’s workplace productivity. It’s the creation of marketing assets and materials for communications. It’s science and engineering, right?” Mattmann said. “Because AI is so broad, some organizations have decided, look, this isn’t an IT-only thing.”
The breadth of the role has prompted some universities to move it outside of their IT departments. At UCLA, Mattmann operates from the administrative arm of the university rather than the technology arm, though he works closely with both.
“A lot of the questions I get asked sometimes are like, ‘Well, you’re the chief AI officer. Which tool are you going to use?’” he said. “I want to turn the conversation around.”
BUILDING STRATEGY BY CONNECTING STAKEHOLDERS
David Ebert, chief AI and data science officer at the University of Arizona, agreed with Mattmann that the role requires a broader view than choosing a specific tool. While Ebert has experience in computer science and data science, he said the CAIO position spans operational decision-making, faculty development, student services and public outreach.
Because the role is new and ever-evolving, leaders say they take a collaborative approach rather than arriving with a top-down mandate.
Amarda Shehu, the first vice president and chief artificial intelligence officer at George Mason University in Virginia, started by forming an AI visioning task force with representatives from every college and nonacademic unit. The task force quickly produced a set of universitywide guidelines for ethical AI use, a first for any Virginia university, according to Shehu.
“This role is a connector, in the sense that [it] tries to work with all the colleges to have a concerted AI strategy, but then to also have it customized. What does that look like for business? What does that look like for education? What does that look like for the humanities?” she said. “You ideate strategy, but you align it with the colleges, and then you mobilize the different offices in order to go from strategy to implementation. That’s why you need somebody that is creating these connections.”
The task force’s collaboration has also sparked program-specific AI updates, including a master’s degree in AI, a public health AI concentration and an AI ethics minor created in collaboration with the humanities department.
Before moving to Arizona in April, Ebert had taken a similar path when he briefly pioneered the CAIO role in an interim capacity at the University of Oklahoma. He engaged faculty and administrators individually, but also worked with governance groups like the faculty senate, staff senate and student government association. He said that kind of input can shape working groups that can act quickly in response to emerging technologies and concerns.
At UCLA, Mattmann’s team is housed within the Digital and Technology Solutions unit, revamped from the IT services department. They used a competitive proposal system to distribute 1,000 enterprise licenses for OpenAI tools across campus, including for teaching, research and administration.
“We said, ‘Hey, campus, how would you use these OpenAI ChatGPT enterprise licenses, if you got one?’” he said. “Now we have over 68 projects, almost 70 projects, at the enterprise level and we have a much better understanding of how people would use AI on the campus.”
MANAGING COMPLEXITY AND COMPREHENSION
One of the central challenges facing CAIOs is managing the wide variation in attitudes toward AI. Shehu was surprised by the range of opinions uncovered in student and faculty surveys put out by her task force.
“For completely legitimate reasons that I even empathize with, we had instructors that said, ‘No. We should not integrate AI,’” Shehu said. “This came from students that create. They create dance, they create art, they write poems, right? They were very adamant.”
Rather than prescribe uniform adoption, George Mason’s guidelines emphasize agency. Instructors and staff can choose how and whether to use AI.
At the University of Oklahoma, Ebert encountered similar concerns. Some students expressed fears that their work might be graded by AI without faculty oversight. Others questioned whether faculty had been properly trained to use AI responsibly.
“It really became apparent that what you need to have is a safe environment where people with every level of experience are able to get engaged,” he said. “You’re there to upskill everyone, to have people help each other and create a network of support.”
For Ebert, this meant establishing mentorship models, office hours and upskilling initiatives tailored to a range of comfort levels.
A ROLE STILL TAKING SHAPE
While few institutions have chief AI officers today, those in the role say its scope is only growing.
“There’s no manual that tells you how to do this job,” Shehu said.
While private industry tends to prioritize speed, higher-ed CAIOs are building deliberately and collaboratively for the long term.
Mattmann, who previously served as chief technology officer at NASA’s Jet Propulsion Laboratory and contributed to federal AI policy, said building internal capacity will be key and cautioned against overreliance on commercial vendors. Upskilling can’t just be about prompt engineering, he said, but needs to encompass foundational data literacy that can drive innovation.
He pointed to national labs, once the center of supercomputing, which laid the groundwork for commercial innovations. Strong leadership on AI in higher education could help education play a bigger role in shaping AI trends and governance overall.
“It’s a discipline that we should be leading in higher education and not ceding to commercial industry,” he said.
Editor's note: A previous version of this story suggested that David Ebert was talking about his current role as chief AI and data science officer at the University of Arizona. It has been updated to reflect that some of his comments referred to his past work as interim CAIO at the University of Oklahoma.