
ISTELive 25: Strategies for Teaching Teachers About AI

As the rapid advance of AI raises both the stakes and the urgency of professional development, teacher educators have shared what works, and what doesn't, to get teachers up to speed.

The demand for AI competency is growing fast across many industries, but nowhere faster than in teaching, some experts say, because educators must lay the groundwork of professional knowledge for every other sector. At the ISTELive 25 conference in San Antonio last week, a panel of professors and consultants said professional development (PD) for teachers should include knowledge of AI content, technology and pedagogy, as well as specific examples for implementation and familiarity with related research.

Moderating the discussion, Nancye Blair Black, CEO of the educational consulting firm The Block Uncarved, said she was part of the ISTE AI in Education Preparation Program that collected ideas on this topic from various universities. The group realized that what teacher training programs most needed to prioritize fell into three overlapping categories that aligned with the TPACK (Technological Pedagogical Content Knowledge) framework, an educational model popularized in the 2000s: content knowledge, such as AI literacy and learning about AI; technical knowledge, including AI fluency and knowing how to use the tools; and pedagogical knowledge, which is understanding the teacher’s responsibilities and how to adjust pedagogical practices accordingly.

Stacy George, an assistant professor from the University of Hawaii at Manoa, described the ideal approach to AI in teacher training as being “a cautious advocate with a moral compass.”

CONTENT KNOWLEDGE


Amy Eguchi, an associate teaching professor from the University of California, San Diego, said teachers must learn how AI — and not just generative AI — works, which means their training must include elements of computer science.

To do this, Eguchi recommended AI4K12.org, which outlines the Five Big Ideas in AI, and stressed that AI literacy for primary and secondary educators involves teaching them to engage with, create with, manage and design AI.


TECHNOLOGICAL KNOWLEDGE


On the subject of technological knowledge, Black emphasized the importance of making sure preservice and in-service teachers are proficient not just with their own personal uses of AI, but specifically with the tools they’re likely to use in class. And that’s likely to mean different lessons for teachers in different grade levels.

“It could be that you’re in the elementary level, and your students are using i-Ready or Khan Academy, and AI is doing that personalized learning and adaptive assessment. Then [the teachers] need to understand how that tool works and how to use it safely, ethically and proficiently,” she said. “Similarly, they might need strategies for effectively and efficiently reviewing the transcripts of students’ conversations with chatbots. A lot of people, especially at the middle and senior high level, are now bringing AI tutors in, but it is the burden of the teacher, the responsible AI piece, to review those conversations to make sure the content is right. We have to teach that skill.”

Black added that most teachers will need to know how to use Teachable Machine, a web-based tool for creating machine learning models, and may need new key skills such as prompt engineering or problem formulation.
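To make a skill like prompt engineering concrete, here is a minimal sketch in Python, offered purely as a hypothetical illustration rather than anything demonstrated on the panel. The build_prompt helper is an invented name; it only assembles text and does not call any AI service. What it shows is the habit of spelling out the role, the task, the constraints and an example of the desired output, instead of typing a vague one-line request.

# Hypothetical illustration of "prompt engineering": assembling a structured
# prompt from reusable parts instead of firing off a vague one-liner.
# build_prompt is an invented helper; it only builds text and calls no AI service.

def build_prompt(role, task, constraints, example):
    """Assemble a structured prompt string from its parts."""
    lines = [f"You are {role}.", f"Task: {task}", "Constraints:"]
    lines += [f"- {c}" for c in constraints]
    lines += ["Example of the desired output:", example]
    return "\n".join(lines)

# A vague request a teacher might start with:
vague = "Make a quiz about fractions."

# A more deliberate version that states the role, grade level, format and guardrails:
engineered = build_prompt(
    role="a fourth-grade math teacher",
    task="Write a five-question multiple-choice quiz on comparing fractions.",
    constraints=[
        "Use only denominators of 2, 3, 4, 6 and 8.",
        "Put a single answer key at the end.",
        "Keep the reading level at or below fourth grade.",
    ],
    example="Q1. Which is larger, 1/3 or 1/4? (a) 1/3 (b) 1/4",
)

print(engineered)

The same structure works when typed directly into a chatbot; no programming is required to apply the idea.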

PEDAGOGICAL KNOWLEDGE


George said that getting teachers to update their pedagogical approach to incorporate AI will start with professional development. She cited research showing that teachers often take whatever instructional models they experienced in PD into their own classrooms.

“There’s research that supports this adage that we teach how we were taught, but AI has only recently filtered into our educational systems, and it’s transforming our society,” she said. “Our preservice teachers are going to become leaders in that classroom one day. So AI is getting our preservice teachers to not just be consumers of AI, it’s really getting them to think and use AI ethically and effectively.”

George said that might take a little reframing, since new and improved practices can bring new ethical considerations, such as data privacy, and new potential risks, such as cognitive decline.

INFUSE AI COMPETENCIES INTO TEACHER PREP


Black then moved on to seven critical strategies developed by the ISTE AI in Education Preparation Program to guide training teachers on AI:
  • Foster a universal foundational understanding of AI.
  • Teach them skills for effectively harnessing AI tools for instruction.
  • For AI literacy education, use national frameworks such as the Five Big Ideas in AI.
  • Have them test and explore AI tools in ways that develop and apply their knowledge.
  • Infuse AI literacy across existing curricula.
  • Include critical examinations of AI tools in both K-12 classroom experiences and teacher preparation.
  • Intentionally include the above in teacher preparation.

Offering examples of how she does this, Longwood University assistant professor Alecia Blackwood said she starts college freshmen on basic AI literacy and ethics, proceeds with juniors on AI in disciplinary literacy and ethics, and finally teaches seniors about AI for instructional design, creating ethical guidelines, and using specific tools and building AI chatbots.

For course-level syllabus integration, Sue Kasun, a member of the education faculty at Georgia State University, recommended the GAI2N GenAI Integration Navigator, a 28-page set of guidelines for deciding whether, when and how to integrate GenAI into a course.

Camille Dempsey, an education technology professor at Pennsylvania Western University, stressed the importance of institutional movement and building a culture of AI readiness. She said this happens through one-on-one interactions and not being afraid of difficult conversations.

“I find myself telling a lot of stories, which I think is another great strategy — not pushing people into this, but maybe inviting them to see what kinds of things we’re all doing,” she said. “I also thought it was pretty important to get our students involved, so we started an AI ambassador program … and I took everyone that applied. There were 36 students — undergrad, graduate and doctoral students, we had the whole range. Those students now … are on the schedule for this fall, and there will be some next spring, to teach some of the professional development to faculty as well as other students on their perceptions of what they’re learning about AI.”

OVERRELIANCE AND COGNITIVE DECLINE


In closing, Black cited recent MIT research showing that an overreliance on AI, especially in young people, can negatively affect memory and cognition.

“We have to somehow combat that, and we need teachers to have their minds on,” she said. “There’s also research coming out that’s saying, ‘But when the AI is actually a thought partner that’s giving feedback and prompting reflection, learning increases.’ So it’s really important that we teach these tools in ways that are actually beneficial to students.”

Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor's degree in physiology from Michigan State University and lives in Northern California.