Technology, Engineering Literacy Report Card Measures Student Performance in 6 Categories

The National Assessment of Educational Progress' unique assessment uses scenario-based tasks on computers to measure how eighth-grade students understand the use and effect of technology in their personal lives.

In our technology-driven world, it’s not uncommon to see a child — even a toddler — staring mesmerized into the screen of a computer or mobile device. But how does that technology use affect learning and real-world skills? We now have more insight into that question thanks to the first-ever Nation’s Report Card for Technology and Engineering Literacy assessment, whose results, released May 17, show how students solve real-world problems using digital tools.

The National Assessment of Educational Progress (NAEP) rolled out the unique assessment in 2014 using scenario-based tasks on computers to measure how eighth-grade students understand the use and effect of technology in their personal lives. Approximately 21,500 eighth-grade students participated in the assessment from more than 800 public and private schools across the nation.

Here’s how it worked: Students were asked to complete scenario-based tasks on a laptop computer to solve realistic problems. As part of the assessment, students interacted with multimedia tools and digital narrators to collect information and arrive at solutions. For example, an “Iguana Home” task asked students to find a way to improve the habitat for a classroom iguana. Throughout the task, students watched a video to learn about iguanas and their needs, and were then prompted to design and test some solutions to improve the iguana’s home.

The assessment was carefully designed around a framework that measures student performance in six subcategories: technology and society; design and systems; information and communication technology; understanding technological principles; developing solutions and achieving goals; and communicating and collaborating. Students also completed a questionnaire about their personal knowledge and use of technology. In this way, researchers can understand not only whether students performed well, but in which specific areas they did.

While the comprehensive data is still being analyzed, some key takeaways from the results include:

  • 43 percent of eighth-graders performed at or above the proficient level
  • 52 percent of students reported taking at least one course related to technology and society in school
  • 63 percent of students reported that their families most often taught them about fixing/building things or how things work
  • 50 percent of students reported using a computer to create, edit or organize digital media at least once a month at school

There were also some noteworthy — and unexpected — results. On average, female students performed better than male students, especially in the areas of developing solutions and collaborating. What’s more, white females scored four points higher overall than white males, and black females scored five points higher overall than black males.

Peggy Carr, acting commissioner of the National Center for Education Statistics, was involved in designing the assessment and reviewing the data. She notes that these results were surprising because they don’t align with traditional math and science testing results.

“We did not expect this pattern. It seems pretty clear from the data that girls have the ability to be successful in technology and engineering, and that is worth noting,” she said. “For gender, it was noteworthy because we’re seeing it in multiple domains and practices.”

The results also showed trends by race and ethnicity: white and Asian students scored higher than black and Hispanic students. Further, students from city schools scored lower than their peers in suburban and rural schools. Bill Bushaw, executive director of the National Assessment Governing Board, says these findings show how much learning opportunities can matter for students.

“Students told us these were related to what they learned in and out of school,” Bushaw said. “It comes back to opportunities and how different students for various reasons have different opportunities to learn, and the results highlight the profusion of opportunities for some students and the lack of opportunities for other students.”

Because designing and launching complex assessments like this one requires in-depth collaboration and review from educators and subject-matter experts, the next assessment is not expected until 2018 and will likely focus on social studies. Carr anticipates, however, that this style of assessment will be used more widely in the future to better measure student performance.

“It affords us an opportunity to have a digital footprint to see how students process information. We’re able to track timing, speed, patterns, processing, how students are clicking through the scenario. We are going to learn a lot,” she said. “Also, just problem-solving itself speaks to a lot of issues of troubleshooting. While we’ve always had practices of these constructs, this is the first time we’ve developed subscales for them and how to think about how students approach these tasks.”

Bushaw adds that assessments of this nature provide insights into where learning occurs for students in the modern world.

“Our hope now is to show progress,” he said. “There are clearly opportunities to learn in and outside of school. The potential for those two areas to work together will be most helpful in achievement.”