
Rochester Instructor Creates AI Learning Tool for Deaf Students

Grammar Laboratory, a new tool developed by an ASL instructor at Rochester Institute of Technology’s National Technical Institute for the Deaf, uses the live feedback capabilities of AI to personalize English lessons.

A student with a hearing aid faces a laptop computer. On the screen, a person in a brown shirt signs.
Shutterstock
Deaf and hard of hearing (DHH) students face unique challenges in English education. Some do not have access to spoken English or American Sign Language (ASL) inputs that promote language skill development early in life. According to the National Association of the Deaf, some parents of DHH students receive medical guidance not to use ASL, which, combined with an inability to hear verbal communication, can lead to a phenomenon called language deprivation.

To make grammar practice more accessible for these students, Erin Finton, a lecturer at Rochester Institute of Technology’s National Technical Institute for the Deaf (NTID), recently teamed up with Google to create an AI-powered English learning tool called Grammar Laboratory.

“There is a lot of research that’s happening about early intervention,” she said. “But there’s much less research on, 'If it has happened already, then what do we do to address it?'”

English instruction for deaf students is not standardized past the middle-school level, Finton said, and more advanced English education tends to focus on things like essay structure rather than grammar — a common area of difficulty for DHH students. Additionally, the learning materials provided to students don’t always work well for DHH learners.

“Even if they attempt to learn grammar, they’re given a textbook,” Finton said. “And if they’re given a textbook in English to learn English, it’s largely inaccessible.”

To better meet these distinct needs, Grammar Lab combines the personalization and live feedback capabilities of AI with alternative materials made by human experts. Finton created videos of herself signing explanations of English grammar rules, like when to use the articles “a,” “an” and “the.” Each video explains not just what the rule is in English, but also how it relates to the conventions of ASL, a distinction not always explored in English education, Finton said.

The tool then tests a student’s understanding of the grammar rule using AI-generated questions. Finton said the questions are an assignment rather than an assessment, and each one includes an “I’m not sure” option that prompts further explanation. Students can also use a chat box at the bottom to ask for clarification or help.

This option is especially important for DHH students, some of whom struggle not with the grammar skill itself, but with the contextual understanding needed to answer the question.

Finton said students have asked AI for context about topics that were unfamiliar to them — for example, hornets or the Hope Diamond — before answering grammar questions about them. She added that students who don't have hearing issues tend to pick up some knowledge about topics like these through incidental learning, absorbing information that is not necessarily directed at them.

“We hear the news in the background, or we hear people having a conversation,” she said. “There’s a lot of learning that happens just in the environment [that] naturally that deaf people miss out on.”

The Google AI underlying Grammar Lab is agentic and designed to adapt, according to Sam Sepah, who leads AI accessibility research programs at Google.

“The AI isn’t just necessarily responding to students' questions or prompts,” Sepah said in an interview via an ASL interpreter. “The AI agent in this case is actually really familiar with what the students are looking for in the specific context of learning English.”

Grammar Lab has been implemented in Finton’s own classroom as well as other English classes at NTID. Teachers receive analytics feedback on how students respond to different topics.

Finton said she and the Grammar Lab team are refining the AI model to build student profiles, tailoring the AI explanations to each student’s language proficiency.

Though created for a specific population, Grammar Lab’s approach could, as a free, open-source solution, assist education throughout NTID and at colleges across the U.S. and the world. Finton said she has already received dozens of emails from instructors across the U.S. asking when it will be available.

“This is a proof of concept, that this works for deaf students,” she said. “I have much greater hopes that it will be used in many different languages, and can have a much wider impact.”
Abby Sourwine is a staff writer for the Center for Digital Education. She has a bachelor's degree in journalism from the University of Oregon and worked in local news before joining the e.Republic team. She is currently located in San Diego, California.