ISTELive 25: Strategies to Disarm Fears Over Implementing AI

At ISTELive 25 on Monday, technology leaders from a private boys’ school in New York City offered suggestions for engaging teachers to demystify AI and decide how to use it, including grade-by-grade ideas for K-8.

A person in a business suit, their head obscured by a dark cloud labeled "AI." (Shutterstock)
More than two and a half years after the launch of ChatGPT, many school districts have moved past the discovery phase of generative artificial intelligence, learning what it is and what it might be able to do, and into full-swing implementation. For those looking to demystify the technology and get buy-in from teachers and parents along the way, school and technology leaders from Allen-Stevenson School in New York City have some advice: Emphasize professional development, participation and communication.

Co-leading a session Monday at the International Society for Technology in Education (ISTE) conference in San Antonio, the school’s Upper Division Technology Integrator Sam Carcamo opened with a summary of the hurdles that people in his position face. What continues to make AI scary to teachers and parents, he said, are fears about its potential for bias, hallucinations, erosion of critical thinking, devaluation of expertise, dehumanization and cheating, cheating, cheating.

His colleague Sarah Kresberg, director of library services and educational technology, said even at a well-funded private K-8 boys’ school like Allen-Stevenson, teachers weren’t sure what to do with ChatGPT at first. Hoping to identify early adopters, they started an AI club for teachers and administrators, which eventually became an AI council. For the 2023 school year, members ran trials of seven different AI tools, and eventually, “after a lot of hand-wringing,” Kresberg said, they settled on SchoolAI for two main reasons: They wanted something that K-8 students could use directly at their age, and they liked SchoolAI’s privacy policy.

The school’s Technology Integrator Ainsley Messina said they then developed an eight- to 10-hour professional development course.

“Our goal was to establish a shared vocabulary among our faculty no matter their comfort level with AI, whether they’d used it before or had not,” she said. “Throughout this PD course, we talked about AI literacy, AI ethics, we talked about bias that exists in AI, and we were talking about, ‘What should some of our concerns as educators be with AI coming our way? How do we need to change the way we teach in order for our students to be successful when they go on to our next schools?’”

Carcamo said the training involved having teachers fill out forms on what they were interested in, for future reference: report-card writing, unit planning, lesson planning or gamification. By fall 2024, Kresberg said, they reconvened for roundtable discussions organized by topic. Each participating teacher had done a deep dive into a particular topic, discussed their thoughts and findings with the others in that group, and the results were then shared with the larger faculty.

Carcamo, Kresberg and Messina said what came out of those discussions was a rough grade-by-grade outline of what the school did, or plans to do, with AI:

  • Kindergarten to first grade: Teach students to recognize the difference between artificial and natural creations, and understand that AI is when people make machines act smartly.
  • Second grade: Start introducing the concept of generative AI and using SchoolAI.
  • Third grade: Have students chat with historical figures about their greatest achievements and what challenges they overcame, and gather information to write a three-paragraph essay about their historical figure.
  • Fourth grade: Go deeper into AI literacy. Use Common Sense Media resources; talk about bias in AI, how it can affect lives, how AI is trained and how it works. Use Google Teachable Machine, which lets students train a simple model on examples they provide so it can perform certain tasks.
  • Fifth grade: Using a giant one-paragraph mega-prompt crafted by a teacher, students asked questions of an AI version of Marcus Aurelius (a minimal sketch of this persona-prompt pattern follows the list). They were then assigned to use Canva to create comic books based on historical stories.
  • Sixth grade: Have students talk with an AI chatbot about the rocks and minerals they were assigned, then use Adobe Express and generative AI to create geology trading cards. In other examples, sixth-grade English students described locations and fed those descriptions to an image generator to see what it would produce; if they didn’t like the result, they refined their descriptions to get the AI closer to what they intended. Sixth-grade Spanish students wrote a story about an angel and a devil trying to convince a character to be naughty or nice, then used Adobe Express to make an animated book, recorded their own audio and synced it to an animated mouth.
  • Seventh grade: Use ChatGPT to get detailed feedback on essays, and Newsela to get writing feedback on pre-test assignments. The school later began allowing students to get Newsela feedback on every essay before turning it in. According to Kresberg, most students used it and found its labeling of paragraph parts very helpful, but she emphasized that students still need a lot of practice writing.
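
For readers who want to see what a teacher-crafted persona "mega-prompt" looks like in practice, here is a minimal sketch of the pattern. It is not the school's actual setup: the session described SchoolAI's built-in tools, while this example assumes the OpenAI Python client, an API key in the environment, and an illustrative prompt, model name and function name.

```python
# Illustrative sketch only; the session described SchoolAI's built-in tools,
# not a custom script. The prompt text, model name and function name below
# are assumptions for demonstration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A condensed stand-in for the teacher's one-paragraph "mega-prompt."
MARCUS_AURELIUS_PROMPT = (
    "You are Marcus Aurelius, Roman emperor and Stoic philosopher. "
    "Answer fifth graders' questions in the first person, in simple language. "
    "Stay in character, keep answers under 100 words, and gently redirect "
    "questions that are off topic or not appropriate for a classroom."
)

def ask_marcus(question: str) -> str:
    """Send one student question to the persona and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat-capable model works here
        messages=[
            {"role": "system", "content": MARCUS_AURELIUS_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_marcus("What was the hardest decision you ever made?"))
```

The key design choice mirrors what the presenters described: the constraints (stay in character, use age-appropriate language, redirect off-topic questions) live entirely in the system prompt, so students only ever type questions.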

Kresberg said the school initially put off talking to parents, but it eventually started a series of five parent engagement meetings throughout the year, called “Tech Tuesdays,” each about an hour long. Fall sessions covered how the school was using AI, and by spring they were covering how parents could use AI at home to help their children learn and build executive functioning skills.

For technology integrators or anyone working with teachers, Messina recommended AI for Education’s six-week AI literacy trainer course, as well as the Women in AI and Education community on Slack.

Carcamo said that as staff began working on AI projects, the work itself raised interest among their colleagues.

The school, Kresberg said, has yet to mandate AI use, but that hasn’t been a problem so far.

“Obviously not everyone is using a lot of AI right now. We’re not mandating that anyone use it. We’re encouraging and facilitating people to use it. I’m not sure that we’re anywhere close to saying people have to use it for anything right now, but we don’t have any real naysayers either,” she said. “I know in some schools there are people who are making things difficult for the ones who want to use it, and luckily, we don’t have that. If people aren’t on board, they’re very quiet about it.”
Andrew Westrope is managing editor of the Center for Digital Education. Before that, he was a staff writer for Government Technology, and previously was a reporter and editor at community newspapers. He has a bachelor’s degree in physiology from Michigan State University and lives in Northern California.