Bumpy runway

Emails reveal months of missteps leading up to Tennessee’s disastrous online testing debut

PHOTO: Chalkbeat Photo Illustration

Tennessee education officials allowed students and teachers to go ahead with a new online testing system that had failed repeatedly in classrooms across the state, according to emails obtained by Chalkbeat.

After local districts spent millions of dollars on new computers, iPads, and upgraded internet service, teachers and students practiced for months taking the tests using MIST, an online testing system run by North Carolina-based test maker Measurement Inc.

They encountered myriad problems: Sometimes, the test questions took three minutes each to load, or wouldn’t load at all. At other times, the test wouldn’t work on iPads. And in some cases, the system even saved the wrong answers.

When students in McMinnville, a town southeast of Nashville, logged on to take their practice tests, they found some questions already filled in — incorrectly — and that they couldn’t change the answers. The unsettling implication: Even if students could take the exam, the scores would not reflect their skills.

“That is a HUGE issue to me,” Warren County High School assistant principal Penny Shockley wrote to Measurement Inc.

Tennessee Education Commissioner Candice McQueen speaks with reporters in February about technical problems with the state's new online assessment.
PHOTO: Grace Tatter

The emails contain numerous alarming reports about practice tests gone awry. They also show that miscommunication between officials with the Tennessee Department of Education and Measurement Inc. made it difficult to fix problems in time for launch.

And they suggest that even as problems continued to emerge as the test date neared, state officials either failed to understand or downplayed the widespread nature of the problems to schools. As a result, district leaders who could have chosen to have students take the test on paper instead moved forward with the online system.

The messages span from October until Feb. 10, two days after the online test debuted and was canceled within hours. Together, they offer a peek into how Tennessee wound up with a worst-case scenario: countless hours wasted by teachers and students preparing for tests that could not be taken.

October: ‘Frustration … is definitely peaking’

Leaders with the Education Department, local districts and Measurement Inc. all knew that Tennessee’s transition to online tests wouldn’t be easy. So the test maker and the department developed a plan to identify weaknesses: stress tests they called “Break MIST” to tax and troubleshoot the online system.

They all had a lot riding on a smooth rollout. Tennessee was counting on the scores to assess whether students are measuring up to new and more challenging standards, to evaluate teachers, and to decide which schools to close. Districts, even the most cash-strapped, had invested millions of dollars on new technology. And Measurement Inc., a small company headquartered in Durham, was looking to prove that it belonged in the multibillion-dollar testing industry’s top tier.

The first “Break MIST” day on Oct. 1 was a mess — as expected. Students in the eastern part of the state logged on without issue, but the system stumbled as the majority of students started their tests an hour later.

That morning, emails show that Measurement Inc. received 105 calls reporting problems. The company noted particular problems in districts using iPads. Officials from the testing company assured the state that the bugs could be fixed, and the education department passed the message on to the public.

Department officials said nearly 1.5 million practice tests were completed successfully over the course of the fall. But emails show that even on days that weren’t meant to tax the system, problems emerged.

On Oct. 20, students in some districts were taking practice tests when “everything quit,” according to a state official who summarized complaints that local technology coordinators were swapping by email.

“Not very reassuring,” wrote Randy Damewood, the IT coordinator in Coffee County.

“Not good news,” agreed John Payne, director of technology for Kingsport City Schools, who suggested that his own district’s tests were working that day.

“The frustration among teachers and central office staff is definitely peaking,” wrote Eric Brown, a state official.

But there was more frustration to come, much of it behind the scenes at the Education Department.

December to January: Communication falters

Even after Measurement Inc. and department officials worked together to address problems during practice tests, the department still wasn’t confident in the online system. They weren’t sure whether problems were due to local infrastructure or something bigger. Officials planned two more “Break MIST” days in January to find out.

But they didn’t involve Measurement Inc. in the planning, at least according to company officials who wrote to the department to say they learned of those plans only after being copied on an email sent to local superintendents by Education Commissioner Candice McQueen.

That message was one of many in which officials with the state or the testing company expressed frustration about communication in the weeks leading up to the testing period.

One tense exchange dealt with the problems faced by students taking practice tests on iPads. “Will the iPad platform be ready for primetime in the spring?” Assistant Commissioner Nakia Towns asked Measurement Inc. officials on Dec. 3. “I feel like we need to be honest on this one.”

The test maker did not email a response, and Towns raised the issue again a month later and indicated that she was still waiting for an answer. “I had asked the question very directly in December,” she wrote Measurement Inc. on Jan. 6. “We urgently need an update.”

It took five more days, until Jan. 11, for her to get an answer. A reply from a Measurement Inc. testing expert blamed the problem on Apple but suggested the company had a “workaround.”

The next day, 504 students in Dyer County, about 80 miles north of Memphis, attempted to take the exam, many of them using iPads. Not one was able to complete the test because questions took too long to load, according to a report from Measurement Inc.’s call center. (Another half-million tests were completed successfully during January, according to department officials.)

Henry Scherich

In an interview this week, McQueen told Chalkbeat that Measurement Inc. never fixed the iPad problem and that state officials called Apple themselves looking for a solution. She was still looking for an answer on Jan. 21, when she tried to speak directly with Measurement Inc. President Henry Scherich.

“She is wondering if there is any way for you to find even 15 minutes today for a call,” McQueen’s chief of staff wrote. “Commissioner will make herself available. We need to speak to someone who would be able to make a decision concerning technology in an effort to get communication to directors of schools today.”

Scherich, who was in Michigan meeting with that state’s education department, initially said he did not have time to speak with McQueen. (Measurement Inc. is one of two companies producing Michigan’s new exam.) Later that day, he agreed to speak.


McQueen said she and her team came to a conclusion the next day: The test wouldn’t work on iPads. They emailed and called districts that had purchased tablets for testing and recommended a switch to paper.

February: A last-minute warning gets too little attention

Even as tensions mounted and glitches piled up, both the department and Measurement Inc. projected confidence about what would happen on Feb. 8, when the test would go live for most Tennessee schools. State officials even invited reporters to Department of Education offices on Feb. 3 to say they were optimistic about the rollout.

But behind the scenes, they were preparing for the worst. McQueen asked the test maker’s call centers to prepare for a major outage, something a Measurement Inc. employee told her was “very unlikely.”

She also emailed districts telling them they should consider switching to paper tests if their students were waiting too long for questions to load. She gave them three days to decide.

Just 15 of Tennessee’s nearly 150 districts took her up on the offer, McQueen told Chalkbeat.

But emails show that the state knew that most districts were having difficulties. When one district’s technology coordinator asked the state for a list of districts ready for the online exam, officials came up short.

“I don’t think I can answer that with any confidence,” the department’s top technology officer wrote.

Five days later, on Monday, Feb. 8, the test officially began. Again, the system handled the first set of test takers but broke down when the rest of the state’s students logged on.

As students stopped being able to connect or saw their tests freeze, emails show that technology directors began frantically contacting each other.

“Has anyone else had MIST drop out on them?” the director from Houston County Schools asked. A chorus of technology directors from other districts replied in the affirmative.

Within hours, Tennessee had ended its foray into online testing. First, McQueen told districts to suspend the exams, then directed them to give up on the online platform altogether.

“We are not confident in the system’s ability to perform consistently,” she wrote in an email to school superintendents that afternoon.

McQueen told Chalkbeat that officials started the day “in good faith,” with an assumption that Measurement Inc. had resolved problems adequately. Scherich told Chalkbeat that he’s still unconvinced that the problems were the company’s fault. He suggested that Tennessee’s decision to cancel testing came too soon.

Either way, the department’s top technology official put it simply when he emailed McQueen on the day of the failure. “It appears that greater procedural and operational rigor could have prevented the network outage,” Cliff Lloyd wrote to McQueen.

The debacle was just what Ravi Gupta, the CEO of a Nashville-based charter school, was worried about when he pressed the state in January for more transparency about the status of the online platform.

“It would be a betrayal of our students’ hard work if adult technical failures stood in the way of their success,” Gupta wrote to McQueen.

In the end, that’s exactly what happened.

Clarification (June 28, 2016): This story has been revised to clarify the impact of the department’s communications on district testing decisions. It has also been updated to include new information about successful practice tests.

Indiana's 2018 legislative session

Indiana’s plan to measure high schools with a college prep test is on hold for two years

PHOTO: Alan Petersime

Thanks to last-minute legislative wrangling, it’s unclear what test Indiana high schoolers will take for the next two years to measure what they have learned in school.

Lawmakers were expected to approve a House bill proposing that Indiana use a college entrance exam, starting in 2019, as the yearly test for high schoolers, at the same time the state works to replace its overall testing system, ISTEP. But the start date for using the SAT or ACT was pushed back from 2019 to 2021, meaning it’s unclear how high schoolers will be judged for the next two years.

This is the latest upheaval in testing as the state works to replace ISTEP in favor of the new ILEARN testing system, a response to years of technical glitches and scoring problems. While a company has already proposed drafting exams for measuring the performance of Indiana students, officials now need to come up with a solution for the high school situation. ILEARN exams for grades 3-8 are still set to begin in 2019.

“Our next steps are to work with (the state board) to help inform them as they decide the plan for the next several years,” said Adam Baker, spokesman for the Indiana Department of Education. “We take concerns seriously and we will continue doing all we can to support schools to manage the transition well.”

The delay in switching from the 10th grade ISTEP to college entrance exams for measuring high school students was proposed Wednesday night as lawmakers wrapped up the 2018 legislative session. Rep. Bob Behning, the bill’s author, said the change came out of a desire to align the testing plan with recommendations on high school tests from a state committee charged with rewriting Indiana’s graduation requirements.

It’s just the latest road bump since the legislature voted last year to scrap ISTEP and replace it with ILEARN, a plan that originally included a computer-adaptive test for grades 3-8 and end-of-course exams for high-schoolers in English, algebra and biology. Indiana is required by the federal government to test students each year in English and math, and periodically, in science.

The Indiana Department of Education started carrying out the plan to move to ILEARN over the summer and eventually selected the American Institutes for Research to write the test, a company that helped create the Common Core-affiliated Smarter Balanced test. AIR’s proposal said the company was prepared to create tests for elementary, middle and high school students.

Then, the “graduation pathways” committee, which includes Behning and Sen. Dennis Kruse, the Senate Education Committee chairman, upended the plan by suggesting the state instead use the SAT or ACT to test high schoolers. The committee said the change would result in a yearly test that has more value to students and is something they can use if they plan to attend college. Under their proposal, the change would have come during the 2021-22 school year.

When lawmakers began the 2018 session, they proposed House Bill 1426, which had a 2019 start. This bill passed out of both chambers and the timeline was unchanged until Wednesday.

In the meantime, the Indiana Department of Education and the Indiana State Board of Education must decide what test high schoolers will take in 2019 and 2020 and how the state as a whole will transition from an Indiana-specific 10th grade ISTEP exam to a college entrance exam.

It’s not clear what approach state education officials will take, but one option is to go forward with AIR’s plan to create high school end-of-course exams. The state will already need a U.S. Government exam, which lawmakers made an option for districts last year, and likely will need one for science because college entrance exams include little to no science content. It could make sense to move ahead with English and math as well, though it will ultimately be up to the state board.

Some educators and national education advocates have raised concerns about whether an exam like the SAT or ACT is appropriate for measuring schools, though 14 states already use one that way.

Jeff Butts, superintendent of Wayne Township, told state board members last week that using the college entrance exams seemed to contradict the state’s focus on students who go straight into the workforce and don’t plan to attend college. And a report from Achieve, a national nonprofit that helps states work on academic standards and tests, cautioned states against using the exams for state accountability because they weren’t designed to measure how well students have mastered state standards.

“The danger in using admissions tests as accountability tests for high school is that many high school teachers will be driven to devote scarce course time to middle school topics, water down the high school content they are supposed to teach in mathematics, or too narrowly focus on a limited range of skills in (English),” the report stated.

House Bill 1426 would also combine Indiana’s four diplomas into a single diploma with four “designations” that mirror current diploma tracks. In addition, it would change rules for getting a graduation waiver and create an “alternate diploma” for students with severe special needs. The bill would also allow the Indiana State Board of Education to consider alternatives to Algebra 2 as a graduation requirement and eliminate the requirement that schools give the Accuplacer remediation test.

It next heads to Gov. Eric Holcomb’s desk to be signed into law.

Keep Out

What’s wrong with auditing all of Colorado’s education programs? Everything, lawmakers said.

Students at DSST: College View Middle School work on a reading assignment during an English Language Development class (Photo By Andy Cross / The Denver Post).

State Rep. Jon Becker pitched the idea as basic good governance. The state auditor’s office examines all sorts of state programs, but it never looks at education, the second largest expenditure in Colorado’s budget and a sector that touches the lives of hundreds of thousands of children. So let the auditor take a good, long look and report back to the legislature on which programs are working and which aren’t.

The State Board of Education hated this idea. So did Democrats. And Republicans. The House Education Committee voted 12-0 this week to reject Becker’s bill, which would have required a systematic review of all educational programs enacted by the legislature and in place for at least six years. Even an amendment that would have put the state board in the driver’s seat couldn’t save it.

As he made his case, Becker, a Republican from Fort Morgan in northeastern Colorado, was careful not to name any specific law he would like to see changed.

“I don’t want people to say, ‘Oh, he’s coming after my ox,’” he told the House Education Committee this week. “I know how this works. And that’s not the intent of this bill. It’s to look at all programs.”

But members of the committee weren’t buying it.

State Rep. Alec Garnett, a Denver Democrat, pressed school board members who testified in favor of the bill to name a law or program they were particularly excited to “shed some light on.” If there’s a law that’s a problem, he asked, wouldn’t it make more sense to drill down just on that law?

They tried to demur.

“I feel like you’re trying to get us to say, we really want you to go after 191 or we really want you to go after charter schools,” said Cathy Kipp, a school board member in the Poudre School District who also serves on the board of the Colorado Association of School Boards. “That’s not what this is about.”

Kipp said committee members seemed to be “scared that if their pet programs get looked at, they’ll be eliminated. Why be scared? Shouldn’t we want these programs to be looked at?”

But proponents’ own testimony seemed to suggest some potential targets, including Senate Bill 191, Colorado’s landmark teacher effectiveness law.

As Carrie Warren-Gully, president of the school boards association, argued for the benefits of an independent evaluation of education programs, she offered up an example: The schedules of administrators who have to evaluate dozens of teachers under the law are more complicated than “a flight plan at DIA,” and districts have to hire additional administrators just to manage evaluations, cutting into the resources available for students, she said.

The debate reflected ongoing tensions between the state and school districts over Colorado’s complex system for evaluating schools and teachers and holding them accountable for student achievement. The systematic review bill was supported by the Colorado Association of School Boards, the Colorado Association of School Executives, and the Colorado Rural Schools Alliance.

Lawmakers repeatedly told school officials that if they have problems with particular parts of existing legislation, they should come to the legislature for help, where they would surely find allies.

Exasperated school officials responded by pointing to the past failure of legislation that would have tweaked aspects of evaluations or assessments — but the frustration was mutual.

“Just because people don’t agree with one specific approach doesn’t mean people aren’t willing to come to the table,” said committee chair Brittany Pettersen, a Lakewood Democrat.

There were other concerns, including the possibility that this type of expansive evaluation would prove expensive and create yet another bureaucracy.

“When have we ever grown government to shrink it?” asked state Rep. Paul Lundeen, a Monument Republican. “There’s a paradox here.”

And state Rep. James Wilson, a Salida Republican who is also a former teacher and school superintendent, questioned whether the auditor’s office has the expertise to review education programs. He also asked what standard would be applied to evaluate programs that are implemented differently in more than 170 school districts across the state.

“If it’s effective more often than not, will they keep it?” Wilson asked. “If it doesn’t work in a third of them, it’s gone?”

State Board of Education members had similar questions when they decided earlier this year that this bill was a bad idea. Many of Colorado’s education laws don’t have clear measures of success against which their performance can be evaluated.

The READ Act, for example, stresses the importance of every child learning to read well in early elementary school and outlines the steps that schools have to take to measure reading ability and provide interventions to help students who are falling behind their peers.

But how many children need to improve their reading and by how much for the READ Act to be deemed effective or efficient? That’s not outlined in the legislation.

Proponents of the bill said outside evaluators could identify best practices and spread them to other districts, but state board members said they already monitor these programs on an ongoing basis and produce thousands of pages of reports on them for the legislature every year. In short, they say they’re on the case.

“The state board, I can assure you, are very devoted and intent to make sure that we follow, monitor, and watch the progress of any programs that go through our department and make sure they’re enacted in the best way possible within the schools,” board member Jane Goff said.