behind the scenes

What a day is like inside Pearson’s test scoring facility

Facing widespread backlash after years of controversies and testing glitches, one of the world’s largest testing companies is taking an unusual approach to quieting critics: It’s opening its doors.

Pearson, a British-based testing conglomerate that recently signed on for a two-year contract to aid in writing and administering Indiana’s ISTEP test, today invited Indianapolis reporters to its north side scoring facility in an effort to reveal how the company hand-scores questions on hundreds of thousands of student exams.

It’s part of a charm offensive from a company that depends on public contracts but is often at the center of public debate over testing in schools.

“We don’t see any reason why we shouldn’t pull back the curtain,” said Scott Overland, Pearson’s new director of media.

With parents and educators often skeptical about how exams are made and scored — at a time when test scores can influence everything from teacher bonuses to whether schools are allowed to stay open — Pearson is hoping to allay fears by better explaining its work.

“We completely understand how this whole assessment process, the knowledge that parents and educators and the public has about that is limited,” Overland said as he led reporters through Pearson’s scoring facility on the third floor of a mid-sized office building.

The tour featured a short walk through the floor, which consisted of three large rooms and several small offices and conference rooms. Pearson executives and officials in charge of the scoring process explained how scorers, who must have a four-year college degree, are recruited by reaching out to retired teachers, tutors and other educators. They receive about seven hours of in-person or online training, and most learn to score just one type of open-ended question — including essays and math problems that require students to show their work to demonstrate that they understand the concept.

Multiple choice questions are graded by machine at Pearson’s main scoring facility in Iowa.

Pearson executives on the tour showed reporters how scorers log onto a computer platform where they see scanned images of the student papers they must assess and grade. Their work is checked against "validity" questions — items with known correct scores that supervisors use to monitor scorers for accuracy.

Scoring supervisors sit quietly at rows of tables in front of boxy computers in the scoring center. They’re in regular communication with the scorers themselves, who typically work from home across Indiana. Because scorers and supervisors don’t necessarily work regular business hours, many tables were sparsely filled Thursday morning.

Allison Tucker, a scoring supervisor for fourth-grade reading who’s been working with Pearson for more than 10 years, said one of her graders might do 70 questions in an hour. If a scorer gets off track and starts grading incorrectly, Tucker said that’s where the supervisors can step in.

“That’s one of the things that we really take seriously,” Tucker said. “So far it hasn’t been a problem for us.”

Few businesses in the education world are under as much day-to-day scrutiny as testing giants like Pearson, since just a handful of companies compete for lucrative state testing contracts and the chance to sell associated test prep materials for those exams.

Pearson is the largest education company in the world and a leader in standardized test development, having nabbed a contract for the multistate, Common Core-linked PARCC exam and one to handle the scoring for the National Assessment of Educational Progress (NAEP).

Yet it’s an industry frequently under fire when errors are discovered among millions of test questions or when problems arise with scoring or computer testing platforms. Every few weeks during standardized testing season, critics can seize on headlines reporting computer malfunctions or other testing disruptions.

Just yesterday, an employee error forced widespread cancellations of New Jersey's PARCC exam.

The problems aren’t limited to Pearson. Indiana’s 2015 ISTEP test, which was haunted by glitches and scoring delays, was administered by California-based CTB, a Pearson competitor. CTB also ran into problems in 2013, when about 78,000 Indiana students taking the test on computers were interrupted over the course of several days — an error that forced CTB to pay $13 million in damages to the state.

Indiana then dumped CTB and hired Pearson last year with a more than $30 million contract to administer the 2016 and 2017 ISTEP exams, but the state is now looking to create yet another new exam for 2018.

The new exam will surely generate another sought-after testing contract. So Pearson could be treating the ISTEP as something of an audition, trying to make a good impression in hopes of ongoing work.

“We recognize very much that this is critically important work we are doing,” said Melodie Jurgens, who oversees the general scoring process. “Our scorers are quite passionate, and they care a lot about how students do. They want to get it right because they know it’s important.”

Indiana is one of the first states where Pearson has invited reporters to tour its facilities, though Overland said some national news outlets were given tours of the Iowa facility earlier this week. The company hasn’t used such strategies in the past, he said, but plans to open up tours in other states going forward.

Granting this level of access to reporters isn’t a common move for testing companies, said Bob Schaeffer, spokesman for The National Center for Fair and Open Testing, an organization that acts as a testing watchdog. He said he’d been contacted by another reporter about a similar tour this past week but had never heard of this approach before.

But given the challenges Pearson has faced recently — including the loss of three major testing contracts in Florida, Texas and New York — it’s not necessarily a surprise.

“All the major testing companies have had computer testing failures,” Schaeffer said. “It shows an incredible pattern of technological failure that is more than the isolated glitch that they like to make it seem.”

Since Indiana switched to Pearson this year, things have gone relatively smoothly. The state officially started its second round of 2016 ISTEP tests this week, and few problems have been reported.

But Schaeffer said Indiana has “jumped from the frying pan into the incinerator” by making its test vendor switch.

“It’s a perverse game of musical chairs in which a state might reject a contract with a vendor for doing a bad job and hires a new vendor who has time available because they just got fired from another job,” Schaeffer said.

testing testing

McQueen declares online practice test of TNReady a success

PHOTO: Manuel Breva Colmeiro/Getty Images

Tennessee’s computer testing platform held steady Tuesday as thousands of students logged on to test the test that lumbered through fits and starts last spring.

Hours after completing the 40-minute simulation with the help of more than a third of the state’s school districts, Education Commissioner Candice McQueen declared the practice run a success.

“We saw what we expected to see: a high volume of students are able to be on the testing platform simultaneously, and they are able to log on and submit practice tests in an overlapping way across Tennessee’s two time zones,” McQueen wrote district superintendents in a celebratory email.

McQueen ordered the “verification test” as a precaution to ensure that Questar, the state’s testing company, had fixed the bugs that contributed to widespread technical snafus and disruptions in April.

The spot check also allowed students to gain experience with the online platform and TNReady content.

“Within the next week, the districts that participated will receive a score report for all students that took a practice test to provide some information about students’ performance that can help inform their teachers’ instruction,” McQueen wrote.

The mock test simulated real testing conditions that schools will face this school year, with students on Eastern Time submitting their exams while students on Central Time were logging on.

In all, about 50,000 students across 51 districts participated, far more than the 30,000 high schoolers who will take their exams online after Thanksgiving in this school year’s first round of TNReady testing. Another simulation is planned before April when the vast majority of testing begins both online and with paper materials.

McQueen said her department will gather feedback this week from districts that participated in the simulation.

testing 1-2-3

Tennessee students to test the test under reworked computer platform

PHOTO: Getty Images

About 45,000 students in a third of Tennessee districts will log on Tuesday for a 40-minute simulation to make sure the state’s testing company has worked the bugs out of its online platform.

That platform, called Nextera, was rife with glitches last spring, disrupting days of testing and mostly disqualifying the results from the state’s accountability systems for students, teachers, and schools.

This week’s simulation is designed to make sure those technical problems don’t happen again under Questar, which in June will finish out its contract to administer the state’s TNReady assessment.

Tuesday’s trial run will begin at 8:30 a.m. Central Time and 9 a.m. Eastern Time in participating schools statewide to simulate testing scheduled for Nov. 26-Dec. 14, when some high school students will take their TNReady exams. Another simulation is planned before spring testing begins in April on a much larger scale.

The simulation is expected to involve far more than the 30,000 students who will test in real life after Thanksgiving. It also will take into account that Tennessee is split into two time zones.

“We’re looking at a true simulation,” said Education Commissioner Candice McQueen, noting that students on Eastern Time will be submitting their trial test forms while students on Central Time are logging on to their computers and tablets.

The goal is to verify that Questar, which has struggled to deliver a clean TNReady administration the last two years, has fixed the online problems that caused headaches for students who tried unsuccessfully to log on or submit their end-of-course tests.




The two primary culprits were functions that Questar added after a successful administration of TNReady last fall but before spring testing began in April: 1) a text-to-speech tool that enabled students with special needs to receive audible instructions; and 2) the coupling of the test’s login system with a new system for teachers to build practice tests.

Because Questar made the changes without conferring with the state, the company breached its contract and was docked $2.5 million out of its $30 million agreement.

“At the end of the day, this is about vendor execution,” McQueen told members of the State Board of Education last week. “We feel like there was a readiness on the part of the department and the districts … but our vendor execution was poor.”

PHOTO: TN.gov
Education Commissioner Candice McQueen

She added: “That’s why we’re taking extra precautions to verify in real time, before the testing window, that things have actually been accomplished.”

By the year’s end, Tennessee plans to request proposals from other companies to take over its testing program beginning in the fall of 2019, with a contract likely to be awarded in April.

The administration of outgoing Gov. Bill Haslam has kept both of Tennessee’s top gubernatorial candidates — Democrat Karl Dean and Republican Bill Lee — in the loop about the process. Officials say they want to avoid the pitfalls that happened as the state raced to find a new vendor in 2014 after the legislature pulled the plug on participating in a multi-state testing consortium known as PARCC.




“We feel like, during the first RFP process, there was lots of content expertise, meaning people who understood math and English language arts,” McQueen said. “But the need to have folks that understand assessment deeply as well as the technical side of assessment was potentially missing.”