High stakes

There’s always been confusion surrounding Tennessee’s growth model. With a missing year of data, new questions pile on

PHOTO: Laura Faith Kebede

At a time when scores are about to be used for high-stakes decisions in how to improve Tennessee’s schools, gaps in the state’s data and uncertainty about how scores were derived have left Memphis officials wondering how to interpret the torrent of information.

Last year’s chaotic state testing, which led to the cancellation of the state’s test for grades 3 to 8, left a crucial gap in the data meant to help make decisions about schools and teachers.

School leaders have also said they were puzzled by the state’s methodology in reaching the so-called growth scores upon which districts and schools are judged — particularly by how they arrived at the Memphis district’s low score.

Even those who are paid to sift through the data say they are having trouble getting answers to questions about the growth scores, known as TVAAS. Bill White, chief of planning and accountability for Shelby County Schools, conceded to board members last week that he didn’t know the ins and outs of the complex formula and the changes meant to compensate for the missing data.

“I have personally never been shown all the mathematics behind our data and how this works,” he told board members. “I do know that it has been peer-reviewed and vetted and it’s essentially been held up among those statisticians. But there is a lot that goes on behind the scenes that no one has been able to walk us through.”

The confusion has renewed skepticism about the state’s value-added model, which is supposed to help officials identify the impact that schools and teachers have on student performance. The system relies on the state’s data measuring student growth in districts.

Part of the problem is last year’s botched testing, which is having multiple ripple effects throughout the state.

This year, growth scores are comparing 2016-17 test results with the 2014-15 school year, the most recent data available. That throws a wrench in how to assess which school or teacher is responsible for a child’s growth over a two-year period. And for elementary schools, that means there is no data for fourth graders this year since testing in third grade, the first year students take state tests, was canceled.

In addition, one subject was dropped entirely from TVAAS calculations because social studies questions were a trial run for elementary and middle school students and did not count.

Statisticians have largely figured out how to calculate growth even when a state transitions to a new test. But the missing data creates a whole other set of challenges that the state's revisions attempt to account for.

One Memphis charter leader said he still isn’t quite sure how his school even got a score, since last year the school’s highest grade was third grade, the first year of testing.

“It’s such a convoluted formula, it’s hard for us to understand. We’re not sure how we got (our score),” said the charter leader, who declined to be named because he was still seeking answers from the state.

Damian Betebenner, a senior associate at the Center for Assessment, which regularly consults with state departments, said missing data on top of a testing transition “muddies the water” on results.

“When you look at growth over two years, so how much the student grew from third to fifth grade, then it’s probably going to be a meaningful quantity,” he said. “But to then assert that it isolates the school contribution becomes a pretty tenuous assertion… It adds another thing that’s changing underneath the scene.”

At the same time, TVAAS scores for struggling schools will be a significant factor in determining which improvement tracks they will be placed on under the state’s new accountability system, as outlined in its plan to comply with the federal Every Student Succeeds Act. For some schools, their TVAAS score will be the difference between continuing under a local intervention model or being eligible to enter the state-run Achievement School District. The school growth scores will also determine which charter schools are eligible for a new pot of state money for facilities.

The state has data analysts based across Tennessee to help districts with their questions and provide data simulations for the complex formula that has been replicated in other states.

“Of course, the reason it is complex is because we want it to be fair for educators and therefore capture as much data and nuance as possible – which is discussed at length in the technical documentation,” said a state department spokeswoman.

The state has also published an overview video of how the formula works and details on the recent changes in a 46-page, formula-packed document from SAS, the private company that calculates teacher and school scores for the state.

But as far as knowing how the state gets from A to Z, White said he still has questions.

“I’ve had some questions about getting access to certain data myself,” said White, who routinely interprets data for the district. “We would like a lot more access to what goes into TVAAS.” (He later declined to elaborate.)

He’s not the only one. When the Tennessee Education Association unsuccessfully sued Knox County Schools over its use of TVAAS in awarding teacher bonuses, access to data on how the scores were calculated was central to the association’s argument that the district denied teachers due process, said Rick Colbert, TEA’s general counsel.

When Colbert attempted to subpoena technical documents on the calculations, SAS blocked it, arguing in part that the request would divulge “trade secrets.”

“When they’re called upon to defend it you get a lot of general statements but you can’t get a lot of information to see if you can back that up,” Colbert said. “There’s so much about TVAAS that can’t be explained.”

Board member Mike Kernell called it a double standard and asked White last week if the district could request a demonstration of the complicated formula.

“I think the state department of education ought to show its work if they’re asking children to show their work,” he said.

Detroit Story Booth

Why one woman thinks special education reform can’t happen in isolation

PHOTO: Colin Maloney
Sharon Kelso, student advocate from Detroit

When Sharon Kelso’s kids and grandkids were still in school, they’d come home and hear the same question from her almost every day: “How was your day in school?” One day, a little over a decade ago, Kelso’s grandson gave a troubling answer. He felt violated when security guards at his school conducted a mass search of students’ personal belongings.

Kelso, a Cass Tech grad, felt compelled to act. Eventually, she became the plaintiff in two cases that outlawed unreasonable mass searches of students in Detroit’s main district.

Fast forward to August, when her three great-nephews lost both their mother and father in the space of a week and Kelso became their guardian. Today, she asks them the same question she has asked two generations of Detroit students: “How was your day in school?”

The answers she receives still deeply inform her advocacy work.


– Colin Maloney

First Person

Why the phrase ‘with fidelity’ is an affront to good teaching

PHOTO: Alan Petersime

“With fidelity” are some of the most damaging words in education.

Districts spend a ton of money paying people to pick out massively expensive, packaged curriculums, as if every one of a thousand classrooms needs the exact same things. Then officials say, over and over again, that they must be implemented “with fidelity.” What they mean is that teachers better not do anything that would serve their students’ specific needs.

When that curriculum does nothing to increase student achievement, it is not blamed. The district person who found it and purchased it is never blamed. Nope. They say, “Well, the teachers must not have been implementing it with fidelity.”

It keeps happening because admitting that schools are messy and students are human and teaching is both creative and artistic would also mean you have to trust teachers and let them have some power. Also, there are some really crappy teachers out there, and programs for everyone are often meant to push that worst-case-scenario line a little higher.

And if everyone’s doing just what they’re supposed to, we’ll get such good, clean numbers, and isn’t that worth a few thousand more dollars?

I was talking with a friend recently, a teacher at an urban school on the East Coast. He had been called to task by his principal for splitting his kids into groups to offer differentiated math instruction based on students’ needs. “But,” the principal said, “did the pacing guide say to differentiate? You need to trust the system.”

I understand the desire to find out if a curriculum “works.” But I don’t trust anyone who can say “trust the system” without vomiting. Not when the system is so much worse than anything teachers would put together.

Last year, my old district implemented Reading Plus, an online reading program that forces students to read at a pace determined by their scores. The trainers promised, literally promised us, that there wasn’t a single reading selection anywhere in the program that could be considered offensive to anyone. God knows I never learned anything from a book that made me feel uncomfortable!

Oh, and students were supposed to use this program — forced-pace reading of benign material followed by multiple-choice questions and more forced-pace reading — for 90 minutes a week. We heard a lot about fidelity when the program did almost nothing for students (and, I believe quite strongly, did far worse than encouraging independent reading of high-interest books for 90 minutes a week would have done).

At the end of that year, I was handed copies of next year’s great adventure in fidelity. I’m not in that district any longer, but the whole district was switching over to SpringBoard, another curriculum, in language arts classes. On came the emails about implementing with fidelity and getting everyone on the same page. We were promised flexibility, you know, so long as we also stuck to the pacing guide of the workbook.

I gave it a look, I did, because only idiots turn down potential tools. But man, it seemed custom-built to keep thinking — especially any creative, critical thought from either students or teachers — to a bare minimum.

I just got an email from two students from last year. They said hi, told me they missed creative writing class, and said they hated SpringBoard, the “evil twin of Reading Plus.”

That district ran out of money and had to cut teachers (including me) at the end of the year. But if they hadn’t, I don’t think I would have lasted long if forced to teach from a pacing guide. I’m a good teacher. Good teachers love to be challenged and supported. They take feedback well, but man do we hate mandates for stuff we know isn’t best for the kids in our room.

Because, from inside a classroom full of dynamic, chaotic brilliance;

from a classroom where that kid just shared that thing that broke all of our hearts;

from a classroom where that other kid figured out that idea they’ve been working on for weeks;

from that classroom where that other kid, who doesn’t know enough of the language, hides how hard he works to keep up and still misses things;

and from that classroom where one kid isn’t sure if they trust you yet, and that other kid trusts you too much, too easily, because their bar had been set too low after years of teachers that didn’t care enough;

from inside that classroom, it’s impossible to trust that anyone else has a better idea than I do about what my students need to do for our next 50 minutes.

Tom Rademacher is a teacher living in Minneapolis who was named Minnesota’s Teacher of the Year in 2014. His book, “It Won’t Be Easy: An Exceedingly Honest (and Slightly Unprofessional) Love Letter to Teaching,” was published in April. He can be found on Twitter @mrtomrad and writes on misterrad.tumblr.com, where this post first appeared.