
Montana High Schools, Colleges Prepare Students to Work With AI

High schoolers are learning about AI through peer-to-peer work and after-school programming like Code Girls United, and higher education institutions in Montana are prioritizing introductory lessons in AI for students.

(TNS) — Whether it’s middle schoolers at Code Girls United or students at Flathead Valley Community College, young people of all ages are being introduced to ways to utilize artificial intelligence — as well as how to think critically when using it.

At the Code Girls United office in Kalispell, founder and Executive Director Marianne Smith savored her last moments of quiet before students showed up for an after-school lesson.

It’s the last day teams could submit projects for the Presidential Artificial Intelligence Challenge, in which students use AI in an app or website to address challenges they see in their community.

“I'm very calm right now, but if you talked to me this morning, I would have been running around like a chicken with my head cut off,” she laughed.

Code Girls United is a nonprofit that provides free after-school programming for girls in grades four through 12 in Montana. In the first half of the school year, students learn the basics of computer science and then in the second half put that knowledge toward a service project.

Near-deadline chaos isn’t new at Code Girls, as the organization participates in many national challenges. But the latest one is part of a larger effort by Code Girls United to incorporate more AI education into its programming, including its new AI Academy.

Smith said the current landscape with AI reminds her of the rise of the Internet. Just as there needed to be guidelines for how young people interact with the World Wide Web, she believes kids need to understand the best ways to use AI.

“AI is not always right. It’s a tool and you need to be a responsible user. And right now, that's not the way it is. We're not paying attention and not teaching them a framework for it. I think that's the part right now that's really missing and that's the void we're trying to fill with our AI Academy,” Smith said.

Artificial intelligence is technology that enables computers to simulate tasks typically requiring human intelligence, such as learning, reasoning and problem-solving.

Though there is hesitancy around AI, there is enthusiasm about how the technology will be used in the future, particularly when it comes to career development.

The Presidential AI Challenge aims to build on that enthusiasm as part of America’s AI Action Plan, created by the Trump administration.

The challenge aims to train students in the responsible use of AI tools to “demystify the technology and prepare America’s students to be confident participants in the AI-assisted workforce, propelling our Nation to new heights of scientific innovation and economic achievement,” according to AI.gov.

As with the local and national app challenges, Code Girls United asks teams from across Montana to identify a problem in their community.

A previous project for one of the app challenges from a team in Red Lodge focused on dark sky preservation, letting app users know the best locations to view stars. Others focus on mental health resources or bullying prevention, like one group that created a lunch buddy app which connected students who needed a friend to sit with at lunchtime.

Smith said in a way, they’ve been working with AI for several years during these challenges, as they use the MIT App Inventor to create apps.

But students may be learning the most about AI through the organization’s new AI Academy, a peer-to-peer video series teaching skills and concepts of AI, including facilitating interesting discussions about ethical uses.

“These girls are having these really adult discussions [about AI]. Like, ‘Well, when is this ethical? Or when is this OK? Is this not OK?...’” Smith said. “‘What do we think about deep fakes, has anybody had experience with this? How does it work?’ So, it's discussions that people aren't really having with kids right now. And they're using AI, whether adults like it or not.”

The peer-to-peer educational format for the videos serves multiple purposes, Smith said. The interns learn skills like public speaking and audio and video recording, while learning AI concepts themselves as they teach. Younger members get to absorb information in a familiar format — someone speaking over visuals and graphics, much like a video from a streamer.

Sometimes this looks like the hosts playing actual games, like Semantris, a word association game powered by machine learning. Other times, it means introducing basics, like the three types of AI: narrow AI, general AI and superintelligent AI.

All of their videos are currently free to watch on Code Girls’ YouTube channel, AI Academy CGU.

Smith said the hope is to incorporate the AI Academy more into the regular curriculum this year. The organization is also currently pursuing grants to potentially launch the Academy outside of Montana.

For students a little closer to jumping into their respective career fields, Flathead Valley Community College faculty have been facilitating discussions about AI and allowing some use of it in classes.

FVCC has made engaging with and critically thinking about AI an important priority, according to Katie Clarke, assistant professor of communication. She said during the college’s in-service at the start of the school year, FVCC President Jane Karas played “The Times They Are A-Changin’” by Bob Dylan. It was a nod to the rapidly shifting landscape they find themselves in, but also a reminder that change is constant, and that it’s their job as educators to prepare students.

“At FVCC, I think the role that we play is teaching people not what to think necessarily, but how to think critically, how to engage, how to study the past — so that they can make informed, thoughtful decisions,” Clarke said.

“We'll never be able to keep up with all the changes, but if we teach them how to respond to change, then they'll be able to adapt accordingly. And AI happens to be one of the biggest adaptations right now,” added Eliza Thomas, associate professor of education, education director of early childhood and division chair of the social science department.

Clarke and Thomas are co-leaders of the Teaching and Learning Center, which has been leading discussions about AI with faculty.

Thomas said the college hasn’t made a universal statement about AI that all faculty have to adhere to, which works well, because it impacts different careers and programs in unique ways.

“For some it's a huge part of their career changing curriculum, and for others, it's imperative that it's not as much of an influence, because of the type of curriculum that it is,” Thomas said.

One of the first topics of discussion among faculty about AI was how to put up guardrails, Clarke said, but they quickly realized it’s a futile effort because the technology is changing so quickly. Instead, the approach would require flexibility and nuance.

Instead of using unreliable AI-detecting software, Thomas said it’s more practical to rely on what a professor knows about their students. Much like plagiarism, which is not tolerated, she said a professor is likely to notice if a student completely changes the way they write throughout the semester.

In her classes, she asks students to not use AI to answer writing prompts about personal reflections.

“I don't want AI's thinking. I don't want a conglomerate of that. I want to know you and what you think about this, and there really isn't a right or wrong answer if you justify your thinking and how you got here,” Thomas said. “But I don't mind if they use it to enhance their lesson plans or use it to come up with a rubric that's going to help them assess their future students. That's a great use of AI, in my opinion.”

Thomas said she’s also seen a student use AI to generate art inspired by a child she worked with for a project, whose parents were not comfortable with sharing photos due to privacy reasons.

“That was a fantastic use of artificial augmented intelligence that honored all of those different things. She's specific about citing where those photos and images came from, that those were not real photos and images. But it enhanced her presentation and was able to illustrate what she had done in an ethical way, for children's identity and the parents’ right to privacy,” Thomas said.

There are plenty of other ways professors are encouraging students to use AI for studying or workforce preparedness.

Dawn Rauscher, chair of the business and technology department and web technology and graphic design professor, uses AI to help prepare students for interviews. She’ll ask the AI chatbot to come up with questions that are being asked during interviews for entry-level graphic design positions.

Medical Laboratory Technology Assistant Professor Amanda Eney told fellow faculty members that she encourages students to take material from class and use AI to generate practice quizzes to prep for exams.

“But again, she tells the students that AI can be wrong, so you need to double check it. AI can be a helper, but you're accountable ultimately for the outcome,” Clarke said.

On the opposite end of the spectrum, some professors do not want to see students use AI in classes or don’t feel that its use is applicable in their field. Clarke said writing professors have gone back to old blue books, which require students to write essays by hand during in-class exams.

One faculty member in particular was giving oral exams only, where students came into his office and answered questions one by one.

Rebecca Spear, an assistant professor of theater, used AI to generate exercises in class. In keeping with earlier discussions about ethics and AI, she made that known to the students.

“But then the students were really averse to doing the AI activities because there are concerns about AI and the environment. So, it's like all of us are learning to be very thoughtful users and thoughtful engagers,” Clarke said.

Ethical and practical questions about AI are explored in the current all-campus read, “Co-Intelligence: Living and Working With AI” by Ethan Mollick. Faculty and students are reading the book, which frames engaging with AI as if it were a collaborator rather than just a tool. Faculty and staff panels have discussed the book from a multidisciplinary perspective.

Most higher education institutions in Montana are prioritizing the introduction of AI for students.

The Future Project from the University of Montana launched in 2025 and aims to explore AI’s role in teaching, learning and work — “while keeping human values at the center,” according to the university’s website. The purpose was to develop the university’s AI commitments and coordinated approach, which incorporated community surveys, open forums and town hall discussions, as well as student and faculty feedback sessions.

Montana State University offers a graduate program in Artificial Intelligence. The 12-credit graduate certificate prepares students to incorporate artificial intelligence techniques when solving problems with computers.

Despite how quickly AI is advancing, educators are committed to facing the new technology head-on by teaching skills that employers will be seeking regardless.

“All employers probably want you to be able to critically think about how you engage with AI. We're definitely meeting community needs, ensuring student success and modeling lifelong learning,” Clarke said.


© 2026 the Daily Inter Lake (Kalispell, Mont.). Distributed by Tribune Content Agency, LLC.