
New Hampshire Redefines How Student Performance Is Measured

Educators around the country are watching to see whether PACE is successful in providing competency-based education. A key factor is the assessment data.

Sanborn Regional School District, serving two small towns in southern New Hampshire, might be modest in size, but it’s participating in a very big project that could impact students, teachers and schools across the country. Sanborn is part of Performance Assessment of Competency Education (PACE), a two-year, state-run pilot project that looks at a broader range of measures for student learning than the traditional standardized testing that has been used for years. 

Besides Sanborn, eight other New Hampshire school districts are part of PACE: Rochester, Epping, Souhegan, Monroe, Concord, Seacoast Charter School, Pittsfield and SAU 35. More than 15 additional districts across the state are planning and preparing to implement the pilot as well. Educators from around the country are watching the project closely to see whether PACE succeeds in providing a competency-based approach that allows students to learn and demonstrate critical knowledge and skills.
Launched in 2015, PACE is an accountability pilot approved by the U.S. Department of Education and designed by the New Hampshire Department of Education to drive deeper learning for students and organizational change for schools and districts. According to the department, the new accountability model was developed in part because “New Hampshire’s educational leaders recognize that the level of improvement required cannot occur with the same type of externally oriented accountability model that has been employed for the past 12 years.” 
Instead of traditional multiple-choice testing, teachers evaluate students based on performance assessments that are relevant to current learning and can demonstrate competency levels. There are currently 17 core performance assessments that all participating PACE school districts have created and use to measure student learning. 
What does a performance assessment look like for students? It can take many forms, including quizzes, writing assignments, projects or presentations. For example, students in Sanborn were asked to create a solar cooker as part of a project demonstrating key science concepts that they were learning at the time.
Many in the education sector are watching PACE closely to see if it has the potential to be implemented throughout New Hampshire and possibly other states across the nation. In fact, the Every Student Succeeds Act (ESSA) now includes the “Assessment and Accountability Demonstration Authority,” which allows states to pilot innovative ways to assess students. Before any official rollouts of the PACE model can happen, however, the pilot must prove that locally developed performance assessments can be used to compare districts. 

Developing Assessment Metrics

While teachers assess students in the PACE model, they have to follow established processes to ensure quality and fairness. From the beginning, teachers across the PACE districts work with a coordinator of curriculum, instruction and assessment, along with 30 teacher content leads trained to develop complex performance tasks, to ensure that the assessments align with PACE standards and guidelines.  
The New Hampshire districts have also been establishing quality and technical standards for their efforts with the help of the Center for Assessment, a nonprofit organization that offers consulting services related to assessment and accountability. With headquarters in New Hampshire, the organization was established in 1998 and has a long history of working with states and school districts to address changes in accountability and assessment in the United States. As part of that collaboration, the center develops metrics to ensure that all assessments align with agreed-upon performance standards. 
During the actual evaluation process, teachers come together as a group to discuss, evaluate and grade student work via a rubric system. All work is cross-evaluated to ensure fairness and consistency. To protect student privacy, all names and identifying information are removed from student work prior to evaluations and are replaced with an ID number that the New Hampshire Department of Education correlates to names later on. 
Those scores are uploaded to PerformancePLUS, a suite of modules that assists New Hampshire school districts in managing and organizing student data, creating local assessments, and mapping curriculum. The Center for Assessment can access the scores to look for variances across districts and adjust performance standards if necessary (i.e., if there is evidence that a teacher is more lenient or stringent than others). As a way to collect comparative data, students in grades three, four and eight still take tests from the Smarter Balanced Assessment Consortium, which 15 states support, and 11th-graders take the SAT. 
To further enhance consistency within schools and across districts during evaluations, participating school districts must meet certain criteria. As noted in a recent progress report to the U.S. Department of Education, the New Hampshire Department of Education set a target for evaluator consistency of 60 percent for each part of multidimensional tasks. The target is based on information from the National Assessment of Educational Progress (NAEP), a federal program designed to measure what students know and can do in various subjects. Furthermore, the state established a target of 54 percent for cross-district evaluation comparability.
Student performance assessment data is accessible to the state, school districts and the Center for Assessment. Parents also receive reports on their students’ performance in math, science or language arts, much as they would for a traditional standardized test. 
While PerformancePLUS has been an adequate tool for the pilot, it has also revealed the need for a more comprehensive technology solution, according to Scott Marion, executive director of the Center for Assessment. Currently, each application handles only part of what the districts need. For example, LibGuides allows them to store and securely share implementation documents, including a task administration guide, monthly meeting minutes, data collection protocols and resources. It also serves as a repository for operational tasks to support assessment needs in PACE districts. But during task development, teachers use Google Docs to collaborate. 
“That is one of the major challenges: to find a solution that does it all,” said Marion. “To make it even more complicated, one that integrates with a variety of home systems in different districts [such as a student management system or learning management system]. That’s been a real challenge for us.”
Since no solution has been found that meets all of its needs for task development, data collection, performance task scoring and calibration, the state recognizes that it may need to build its own system and is considering partnering with a technology company to design a customized solution.  

The Key to Success: Retaining Knowledge

Looking at the data results from the first two years of the pilot, it appears that PACE scores are comparable to Smarter Balanced scores. While additional analysis will give deeper insights, the scores validate the personalized, competency-based approach as a measure of student performance within schools and on a statewide basis. 
In fact, the most recent results show a significant uptick in student scores after PACE implementation. For example, eighth-grade English language arts results from 2015 show a 58 percent proficiency rating for Smarter Balanced and a 48 percent proficiency rating for the PACE group mean. Those scores shot up in 2016 to 62 percent for Smarter Balanced and 53 percent for the PACE group mean. Similar findings were true for 2016 eighth-grade math results.
While these results are encouraging, researchers at the Center for Assessment are clear that further data is needed to determine true, sustainable impact. 
Brian Blake, superintendent at Sanborn Regional School District, notes that the scores point more to the rigor of the assessments than to students’ abilities. “This is a lot harder than memorizing facts,” he said. “I think that if you ask our teachers, or our parents, they would tell you that students are learning at a deeper level than they were before.” He also pointed out that performance assessments have opened the door for some students to improve their evaluation scores by demonstrating what they know, rather than relying on being good test-takers. 
That demonstration and retention of knowledge is precisely why organizations like the National Education Association strongly support the PACE model as a modern approach to assessments. Donna Harris-Aikens, the association’s director of education policy and practice, foresees this as the future of student assessments. “People are rapidly coming to the conclusion that the kinds of assessments that students are taking matter,” she said. “If it doesn’t promote learning, you need to ask the question about why you’re trying to give that assessment. Figure out where students’ strengths are. The closer you can get to assessments supporting student learning, the more often these assessments look like projects or students playing a musical score.” 
Harris-Aikens also points to the limited information that standardized testing reveals. “Standardized testing provides some data that is useful trend data, but it doesn’t provide information that can be used in individual schools or classrooms about how to help individual students succeed.” 
While the future of assessments across the nation is still uncertain, Paul Leather, deputy commissioner for the New Hampshire Department of Education, notes that the PACE model is swiftly being implemented across the state. “We are rolling the process out as we speak through a multitiered system, by which districts must prove that they are ready to implement the system, as they have the appropriate leadership, training, resources for implementation, and agree to participate in the group activities that drive PACE,” he explained in an email. “Typically, they go through a couple years of preparation work in several tiers of support before implementing.”
Leather notes that while there are still areas to improve, the PACE pilot has so far held up well as an accountability system. “We need to do more to shore up college and career-ready math performance in our high schools — this is a No. 1 priority. We believe that this system provides much in terms of supporting schools and educators, in terms of understanding rigorous expectations for students in new ways, and in investment in their students’ success.”