Behind the scenes

What a day is like inside Pearson’s test scoring facility

Facing widespread backlash after years of controversies and testing glitches, one of the world’s largest testing companies is taking an unusual approach to quieting critics: It’s opening its doors.

Pearson, a British-based testing conglomerate that recently signed on for a two-year contract to aid in writing and administering Indiana’s ISTEP test, today invited Indianapolis reporters to its north side scoring facility in an effort to reveal how the company hand-scores questions on hundreds of thousands of student exams.

It’s part of a charm offensive from a company that depends on public contracts but is often at the center of public debate over testing in schools.

“We don’t see any reason why we shouldn’t pull back the curtain,” said Scott Overland, Pearson’s new director of media.

With parents and educators often skeptical about how exams are made and scored, and with test scores influencing everything from teacher bonuses to whether schools are allowed to stay open, Pearson is hoping to allay fears by better explaining its work.

“We completely understand how this whole assessment process, the knowledge that parents and educators and the public has about that is limited,” Overland said as he led reporters through Pearson’s scoring facility on the third floor of a mid-sized office building.

The tour featured a short walk through the floor, which consisted of three large rooms and several small offices and conference rooms. Pearson executives and officials in charge of the scoring process explained how scorers, who must have a four-year college degree, are recruited through outreach to retired teachers, tutors and other educators. They receive about seven hours of in-person or online training, and most learn to score just one type of open-ended question, such as essays or math problems that require students to show their work to demonstrate that they understand the concept.

Multiple choice questions are graded by machine at Pearson’s main scoring facility in Iowa.

Pearson executives on the tour showed reporters how scorers log onto a computer platform to view scanned images of the student responses they must assess and grade. Supervisors check scorers’ accuracy with “validity” questions.

Scoring supervisors sit quietly at rows of tables in front of boxy computers in the scoring center. They’re in regular communication with the scorers themselves, who typically work from home across Indiana. Because scorers and supervisors don’t necessarily work regular business hours, many tables were sparsely filled Thursday morning.

Allison Tucker, a scoring supervisor for fourth-grade reading who’s been working with Pearson for more than 10 years, said one of her graders might do 70 questions in an hour. If a scorer gets off track and starts grading incorrectly, Tucker said that’s where the supervisors can step in.

“That’s one of the things that we really take seriously,” Tucker said. “So far it hasn’t been a problem for us.”

Few businesses in the education world are under as much day-to-day scrutiny as testing giants like Pearson, since just a handful of companies compete for lucrative state testing contracts and the chance to sell associated test prep materials for those exams.

Pearson is the largest education company in the world and a leader in standardized test development, having nabbed a contract for the multistate, Common Core-linked PARCC exam and one to handle the scoring for the National Assessment of Educational Progress (NAEP).

Yet it’s an industry frequently under fire when errors are discovered among millions of test questions or when problems arise with scoring or computer testing platforms. Every few weeks during standardized testing season, critics can seize on headlines reporting computer malfunctions or other testing disruptions.

Just yesterday, an employee error caused widespread cancellations of New Jersey’s PARCC exam.

The problems aren’t limited to Pearson. Indiana’s 2015 ISTEP test, which was haunted by glitches and scoring delays, was administered by California-based CTB, a Pearson competitor. CTB also ran into problems in 2013, when about 78,000 Indiana students taking the test on computers were interrupted over the course of several days, an error that forced CTB to pay $13 million in damages to the state.

Indiana then dumped CTB and hired Pearson last year with a more than $30 million contract to administer the 2016 and 2017 ISTEP exams, but the state is now looking to create yet another new exam for 2018.

The new exam will surely generate another sought-after testing contract. So Pearson could be treating the ISTEP as something of an audition, trying to make a good impression in hopes of ongoing work.

“We recognize very much that this is critically important work we are doing,” said Melodie Jurgens, who oversees the general scoring process. “Our scorers are quite passionate, and they care a lot about how students do. They want to get it right because they know it’s important.”

Indiana is one of the first states where Pearson has invited reporters to tour its facilities, though earlier this week Overland said some national news outlets were given tours of the Iowa facility. The company hasn’t used such strategies in the past, he said, but plans to open up tours in other states going forward.

Granting this level of access to reporters isn’t a common move for testing companies, said Bob Schaeffer, spokesman for The National Center for Fair and Open Testing, an organization that acts as a testing watchdog. He said he’d been contacted by another reporter about a similar tour this past week but had never heard of this approach before.

But given the challenges Pearson has faced recently — including the loss of three major testing contracts in Florida, Texas and New York — it’s not necessarily a surprise.

“All the major testing companies have had computer testing failures,” Schaeffer said. “It shows an incredible pattern of technological failure that is more than the isolated glitch that they like to make it seem.”

Since Indiana switched to Pearson this year, things have gone relatively smoothly. The state officially started its second round of 2016 ISTEP tests this week, and few problems have been reported.

But Schaeffer said Indiana has “jumped from the frying pan into the incinerator” by making its test vendor switch.

“It’s a perverse game of musical chairs in which a state might reject a contract with a vendor for doing a bad job and hires a new vendor who has time available because they just got fired from another job,” Schaeffer said.

Indiana's 2018 legislative session

Indiana’s plan to measure high schools with a college prep test is on hold for two years


Thanks to last-minute legislative wrangling, it’s unclear what test Indiana high schoolers will take for the next two years to measure what they have learned in school.

Lawmakers were expected to approve a House bill proposing that Indiana use a college entrance exam, starting in 2019, as the yearly test for high schoolers while the state works to replace its overall testing system, ISTEP. But the start date for using the SAT or ACT was pushed back from 2019 to 2021, meaning it’s unclear how high schoolers will be judged for the next two years.

This is the latest upheaval in testing as the state works to replace ISTEP in favor of the new ILEARN testing system, a response to years of technical glitches and scoring problems. While a company has already proposed drafting exams for measuring the performance of Indiana students, officials now need to come up with a solution for the high school situation. ILEARN exams for grades 3-8 are still set to begin in 2019.

“Our next steps are to work with (the state board) to help inform them as they decide the plan for the next several years,” said Adam Baker, spokesman for the Indiana Department of Education. “We take concerns seriously and we will continue doing all we can to support schools to manage the transition well.”

The delay in switching from the 10th grade ISTEP to college entrance exams for measuring high school students was proposed Wednesday night as lawmakers wrapped up the 2018 legislative session. Rep. Bob Behning, the bill’s author, said the change came out of a desire to align the testing plan with recommendations on high school tests from a state committee charged with rewriting Indiana’s graduation requirements.

It’s just the latest road bump since the legislature voted last year to scrap ISTEP and replace it with ILEARN, a plan that originally included a computer-adaptive test for grades 3-8 and end-of-course exams for high-schoolers in English, algebra and biology. Indiana is required by the federal government to test students each year in English and math, and periodically, in science.

The Indiana Department of Education started carrying out the plan to move to ILEARN over the summer and eventually selected the American Institutes for Research, a company that helped create the Common Core-affiliated Smarter Balanced test, to write the exam. AIR’s proposal said the company was prepared to create tests for elementary, middle and high school students.

Then, the “graduation pathways” committee, which includes Behning and Sen. Dennis Kruse, the Senate Education Committee chairman, upended the plan by suggesting the state instead use the SAT or ACT to test high schoolers. The committee said the change would result in a yearly test that has more value to students and is something they can use if they plan to attend college. Under their proposal, the change would have come during the 2021-22 school year.

When lawmakers began the 2018 session, they proposed House Bill 1426, which had a 2019 start. This bill passed out of both chambers and the timeline was unchanged until Wednesday.

In the meantime, the Indiana Department of Education and the Indiana State Board of Education must decide what test high schoolers will take in 2019 and 2020 and how the state as a whole will transition from an Indiana-specific 10th grade ISTEP exam to a college entrance exam.

It’s not clear what approach state education officials will take, but one option is to go forward with AIR’s plan to create high school end-of-course exams. The state will already need a U.S. Government exam, which lawmakers made an option for districts last year, and likely will need one for science because college entrance exams include little to no science content. It could make sense to move ahead with English and math as well, though it will ultimately be up to the state board.

Some educators and national education advocates have raised concerns about whether an exam like the SAT or ACT is appropriate for measuring schools, though 14 states already use one that way.

Jeff Butts, superintendent of Wayne Township, told state board members last week that using the college entrance exams seemed to contradict the state’s focus on students who go straight into the workforce and don’t plan to attend college. And a report from Achieve, a national nonprofit that helps states work on academic standards and tests, cautioned states against using the exams for state accountability because they weren’t designed to measure how well students have mastered state standards.

“The danger in using admissions tests as accountability tests for high school is that many high school teachers will be driven to devote scarce course time to middle school topics, water down the high school content they are supposed to teach in mathematics, or too narrowly focus on a limited range of skills in (English),” the report stated.

House Bill 1426 would also combine Indiana’s four diplomas into a single diploma with four “designations” that mirror current diploma tracks. In addition, it would change the rules for getting a graduation waiver and create an “alternate diploma” for students with severe special needs. The bill would also allow the Indiana State Board of Education to consider alternatives to Algebra 2 as a graduation requirement and eliminate the requirement that schools give the Accuplacer remediation test.

It next heads to Gov. Eric Holcomb’s desk to be signed into law.

Keep Out

What’s wrong with auditing all of Colorado’s education programs? Everything, lawmakers said.

Students at DSST: College View Middle School work on a reading assignment during an English Language Development class. (Photo by Andy Cross/The Denver Post)

State Rep. Jon Becker pitched the idea as basic good governance. The state auditor’s office examines all sorts of state programs, but it never looks at education, the second largest expenditure in Colorado’s budget and a sector that touches the lives of hundreds of thousands of children. So let the auditor take a good, long look and report back to the legislature on which programs are working and which aren’t.

The State Board of Education hated this idea. So did Democrats. And Republicans. The House Education Committee voted 12-0 this week to reject Becker’s bill, which would have required a systematic review of all educational programs enacted by the legislature and in place for at least six years. Even an amendment that would have put the state board in the driver’s seat couldn’t save it.

As he made his case, Becker, a Republican from Fort Morgan in northeastern Colorado, was careful not to name any specific law he would like to see changed.

“I don’t want people to say, ‘Oh, he’s coming after my ox,’” he told the House Education Committee this week. “I know how this works. And that’s not the intent of this bill. It’s to look at all programs.”

But members of the committee weren’t buying it.

State Rep. Alec Garnett, a Denver Democrat, pressed school board members who testified in favor of the bill to name a law or program they were particularly excited to “shed some light on.” If there’s a law that’s a problem, he asked, wouldn’t it make more sense to drill down just on that law?

They tried to demur.

“I feel like you’re trying to get us to say, we really want you to go after 191 or we really want you to go after charter schools,” said Cathy Kipp, a school board member in the Poudre School District who also serves on the board of the Colorado Association of School Boards. “That’s not what this is about.”

Kipp said committee members seemed to be “scared that if their pet programs get looked at, they’ll be eliminated. Why be scared? Shouldn’t we want these programs to be looked at?”

But proponents’ own testimony seemed to suggest some potential targets, including Senate Bill 191, Colorado’s landmark teacher effectiveness law.

As Carrie Warren-Gully, president of the school boards association, argued for the benefits of an independent evaluation of education programs, she offered up an example: The schedules of administrators who have to evaluate dozens of teachers under the law are more complicated than “a flight plan at DIA,” and districts have to hire additional administrators just to manage evaluations, cutting into the resources available for students, she said.

The debate reflected ongoing tensions between the state and school districts over Colorado’s complex system for evaluating schools and teachers and holding them accountable for student achievement. The systematic review bill was supported by the Colorado Association of School Boards, the Colorado Association of School Executives, and the Colorado Rural Schools Alliance.

Lawmakers repeatedly told school officials that if they have problems with particular parts of existing legislation, they should bring those concerns to the legislature, where they would surely find allies.

Exasperated school officials responded by pointing to the past failure of legislation that would have tweaked aspects of evaluations or assessments — but the frustration was mutual.

“Just because people don’t agree with one specific approach doesn’t mean people aren’t willing to come to the table,” said committee chair Brittany Pettersen, a Lakewood Democrat.

There were other concerns, including the possibility that this type of expansive evaluation would prove expensive and create yet another bureaucracy.

“When have we ever grown government to shrink it?” asked state Rep. Paul Lundeen, a Monument Republican. “There’s a paradox here.”

And state Rep. James Wilson, a Salida Republican who is also a former teacher and school superintendent, questioned whether the auditor’s office has the expertise to review education programs. He also asked what standard would be applied to evaluate programs that are implemented differently in more than 170 school districts across the state.

“If it’s effective more often than not, will they keep it?” Wilson asked. “If it doesn’t work in a third of them, it’s gone?”

State Board of Education members had similar questions when they decided earlier this year that this bill was a bad idea. Many of Colorado’s education laws don’t have clear measures of success against which their performance can be evaluated.

The READ Act, for example, stresses the importance of every child learning to read well in early elementary school and outlines the steps that schools have to take to measure reading ability and provide interventions to help students who are falling behind their peers.

But how many children need to improve their reading and by how much for the READ Act to be deemed effective or efficient? That’s not outlined in the legislation.

Proponents of the bill said outside evaluators could identify best practices and spread them to other districts. But state board members said they already monitor these programs on an ongoing basis and send thousands of pages of reports on them to the legislature every year. In short, they say they’re on the case.

“The state board, I can assure you, are very devoted and intent to make sure that we follow, monitor, and watch the progress of any programs that go through our department and make sure they’re enacted in the best way possible within the schools,” board member Jane Goff said.