In a case study report on the Edtech Equity Project published last month, Aspen Policy Academy Fellows Nidhi Hebbar and Madison Jacobs said the initiative helps schools and ed-tech companies alike confront racial bias baked into educational tools. To combat the issue, they developed three targeted resources: for ed-tech companies, an AI in Education Toolkit for Racial Equity that lays out criteria to consider in the design process, along with a certification seal, awarded through the nonprofit Digital Promise, for companies that meet them; and for procurement leaders, a School Procurement Guide with questions for digging deeper into the equity practices of ed-tech tools before they are implemented on campus.
In policy guidance published in 2024, Hebbar and Jacobs noted disparities Black and brown students face in the education system, including more frequent suspensions, placement in less rigorous academic tracks and the school-to-prison pipeline. Documented race gaps in attendance, discipline records, grades and test scores could all influence AI algorithms trained on that data, they said.
“Without rigorous and intentional oversight, ed-tech products that use AI and machine learning will amplify the existing racial biases already present in the data within our school systems and introduce new biases through assumptions in algorithmic design,” according to the policy guidance document.
Currently, no laws require ed-tech providers to disclose bias in their AI, according to the Edtech Equity Project’s website. This leaves it up to schools to ask for transparency and fairness, and to ed-tech vendors to be proactive in working against these biases.
A TOOLKIT, A CERTIFICATION AND A PROCUREMENT GUIDE
To help with this, the project’s first major output was the AI in Education Toolkit for Racial Equity. In the early ideation stage, the toolkit prompts product teams to clearly define their AI tool’s goals and values, assess whether AI is the appropriate solution for the problem they’re trying to solve, and weigh the risks to racial equity if the algorithm misfires. The toolkit then advises teams to examine the assumptions and data sets behind their idea to catch biases early, and to verify during algorithm training that the tool responds well to students from different backgrounds. That careful consideration runs all the way through implementation, the toolkit says, including ensuring teachers are trained on equitable use.
“In an ideal world, companies would embed the activities and practices in our toolkit into their standard development processes,” Hebbar said in a video.
In late 2021, the project partnered with nonprofit Digital Promise to create a product certification for tools that “center racial equity throughout design and development.” Built on the same criteria as the toolkit, the Prioritizing Racial Equity in AI Design certification provides an industry standard for equitable design. The application lists five key qualifications, including implementing at least two bias-mitigation practices and giving users the option to override a product’s decisions for an individual learner.
For K-12 districts evaluating software tools, the School Procurement Guide provides audit questionnaires and a rubric for assessing vendors on equity criteria, with questions about how a product was built and tested, how it will use data, and how race figures into that data.
“By asking the right questions during procurement, your school can become a powerful partner in advocating for equitable design and development of the products they use,” the guide says.