N.C. Tech Leaders Discuss Possibilities, Pitfalls of AI Tools

Some small-scale AI use cases are already underway. But in a webinar, “The Local Imperative: Policy and Use of AI,” state and local officials in North Carolina urged caution before employing the new technology in wider applications.

With artificial intelligence applications still in relative infancy, North Carolina public-sector tech leaders on Wednesday emphasized being cautious and inquisitive before adoption — and not testing them with sensitive data.

Their own explorations are, in some cases, already underway, said city, county and state IT heads during “The Local Imperative: Policy and Use of AI,” a webinar organized by TechConnect. But much about the technology, they warned, remains to be discovered.

“There is so much that is not known about the behaviors of these models. It’s really complicated. It’s fairly chaotic,” said Jonathan Feldman, chief information officer at Wake County, N.C. “And so the last thing you want to do — and I think this is something we have support of at every level in Wake County — is, nobody wants to be the one who ends up in the newspaper because they used an AI model to do something goofy with data.”

The county is leading an effort known as the “enslaved persons project,” where volunteers comb through records to better understand the history of enslaved people. Information Services officials are looking into using AI to assist in document review.

“That’s in flight right now. And it’s really exciting. Because it potentially saves the volunteers a lot of time,” Feldman said.

The town of Chapel Hill is using generative AI to rewrite documents and policies in a way that's more user-friendly for the public, translating them out of government speak into more understandable and accessible language. Officials can even tell the technology what grade level they'd like a document written at, CIO Chris Butts said. The data being fed into generative AI is, however, limited to material such as commissioner memos and job descriptions: information that's already publicly available.

“We’re still exploring this. And some of those use cases have just been some easy low-hanging fruit, where we can get some easy wins,” he said, adding that for now, the guiding rule is: “Don’t put any of that sensitive data in there yet.”

Keith Briggs, head of enterprise architecture and innovation at the North Carolina Department of Information Technology, offered attendees a blunt blueprint for use.

“Don’t use Internet, open source, generative AI,” Briggs said. “And when you do use a secure AI capability, make sure you do it responsibly. And a part of that responsible use is validating the output.”

And Mark Wittenburg, CIO at the city of Raleigh, said AI is not an outcome but a tool, and urged "healthy guidelines and guardrails" instead of fear.

“But really, I think it’s important for us, especially as IT leaders, to really explore what the technology can do,” Wittenburg said. “And then be very mindful, again, about the community, the impacts to the community, and positive and negative impacts that it can potentially have.”

Skip Descant writes about smart cities, the Internet of Things, transportation and other areas. He spent more than 12 years reporting for daily newspapers in Mississippi, Arkansas, Louisiana and California. He lives in downtown Yreka, Calif.