
Academic Technologist: AI Calls for Small Teams, Experimentation

Former University of Pennsylvania tech leader Rob Nelson said small, cross-functional teams, local experimentation and faculty-centered support can unlock meaningful innovation in applications of AI.

As institutions continue to integrate generative AI into their teaching and operations, hype around tools like ChatGPT dominates the conversation, but academic technologist Rob Nelson urges higher-ed leaders to resist the temptation to scale too quickly. Instead, he said, they should focus on small, structured experimentation and on collaboration between technologists and educators.


Nelson served as the executive director for academic technology and planning at the University of Pennsylvania for six years and now writes a blog on technology and education with more than 1,000 subscribers. He said that for the last 10 years, the work of education technology leaders has often centered on pursuing the best enterprise tools to meet campus needs.

“Generative AI is a digital tool, but it’s also something different,” he said. “It does something else, probably a lot more than we can even think of or imagine using it for now. And so that difference, for me, really argues for taking a more incremental and experimental approach to how we implement it.”

BEYOND CHATBOTS: BUILDING COLLABORATIVE, LOCAL CAPACITY


For example, The Generator, Babson College’s AI-focused innovation lab and teaching center, is designed to bring faculty and technologists together. The center hosts eight specialty labs for AI experimentation in different focus areas and also coordinates broader meetup events, surveys on AI usage and collaborations with companies like Microsoft. At Babson, more than half of the faculty have been peer-trained on AI tools through an internally developed teacher training program.

Nelson contrasted these more incremental strategies with approaches like the California State University system’s $16.9 million contract to provide systemwide access to tools like ChatGPT. The CSU AI Commons houses resources for students, faculty, staff and alumni, including online training courses and information on the recent Artificial Intelligence Educational Innovations Challenge.

When the top-down plan rolled out in February, “there just wasn’t much in the way of guidance or a framework or training around this. They put up some videos from LinkedIn, they pointed people to OpenAI’s website, and said, ‘All right, go learn,’” Nelson said. “That just doesn’t seem like it yielded much in the way of anything more than was already in place, which is a lot of students trying out ChatGPT to do education-related tasks, and a lot of faculty members going, ‘Wait a minute. What’s going on here?’”

STRUCTURED EXPERIMENTATION WITH INSTRUCTIONAL DESIGNERS


In his capacity as a teacher at UPenn, Nelson worked with the Penn Center for Learning Analytics to incorporate a large language model in his classroom as a teaching assistant. This collaboration emerged not from a centralized program but from “knowing a guy who knows a guy,” Nelson said. Still, the outcome was a customized AI model aligned with his pedagogical goals.

This fall, Nelson plans to expand that approach in a new course in which students will build their own chatbots in sandbox environments to better understand how they work. Finding a sandbox environment that allows students to explore chatbots securely requires both technical knowledge and an understanding of the course goals.

As faculty continue to experiment, instructional designers, who sit at the intersection of pedagogy and technology, are an important link, Nelson said.

“With that gap between the people who understand what the technology can do and the people who understand the classroom, you want people who can represent both those views and can translate between them,” he said.

Dedicated spaces like The Generator at Babson, where such experts are easy to find and reach, can help.

TIME, NOT TECH, IS THE REAL BARRIER


Nelson said that despite the potential of AI in education, the most persistent barrier isn’t skepticism or a lack of infrastructure, but time. Instructors are already expected to teach while navigating bureaucratic processes, including adapting to changing funding structures and technologies. For some, learning about AI can feel like yet another obligation.

Encouraging instructors to experiment on a small scale with projects they would like to implement can help lessen the burden, as in one physics course at John Abbott College in Montreal, where instructors refined a large language model to suit their pedagogical needs.

“There’s this expectation that professors must learn everything about AI in order to teach it,” Nelson said. “I just think that’s got it reversed.”

Teachers should be encouraged to be curious, he said, learning from those around them, including their students.

“We don’t know enough about this technology to know what it can do,” he said. “The only way we’re going to find out is by using it and trying it out and seeing how things go.”
Abby Sourwine is a staff writer for the Center for Digital Education. She has a bachelor's degree in journalism from the University of Oregon and worked in local news before joining the e.Republic team. She is currently located in San Diego, California.