Perdido 03

Monday, August 25, 2014

NYSED Looks To Make 40% of APPR Evaluations Based On State Tests

The APPR teacher evaluation system in New York is split so that 60% is based on "subjective" measures like classroom observations and (in some districts) student surveys while the other 40% is based on "student performance."

The "student performance" APPR component is divided into 20% that is based on state tests and 20% that is based on "local assessments."

There has been much criticism of APPR from educators and parents because of the reliance on so much testing to rate teachers.

Some of the "overtesting" criticism stems from the "local assessments" half of the "student performance" component, which can force schools to give pre-tests and post-tests to every student in every subject in order to rate the teachers.

These pre-tests and post-tests (sometimes called "performance assessments" or some other edu-jargon) come on top of the state tests that children have to take, though unlike the state tests, the "local assessments" don't actually count for the students - they just count for the teachers.

All this testing has weighed down the system and forced teachers and students to spend many days on either test prep or testing, and now even the New York State Education Department is hearing the criticism and responding.

Unfortunately, their response is going to cause even bigger problems for teachers.

It seems SED wants districts to get rid of as many of the "local assessments" as possible and just rate teachers on the state tests - even if the tests are NOT in the subject that teachers are licensed in:

The New York state Education Department is urging districts to eliminate as much local testing as possible for the purposes of teacher evaluations, and is committing federal money to help make it happen. But some educators are opposing the increased reliance on state exams.


There is no prospect of eliminating Common Core-aligned tests or other evaluations mandated by the state or federal government, and it is those that have drawn the most complaints. Rather, the state sent a report to every district in the state this summer, identifying locally-designed tests used for teacher assessment purposes and presenting alternate ways for districts to assess teachers without testing children more.


The state also urges districts to swap out locally-developed end-of-year tests in their Annual Professional Performance Review (APPR) formulas for state evaluations that need to be done anyway. The overall goal is for districts to eliminate local tests and lean more heavily on the ones mandated by the state.


Sixty percent of APPR is based on a personal evaluation by a teacher's supervisor; it is the other 40 percent, based on data, that the state is hoping to nudge.

In classes without a Regents exam, the recommended changes would entail sacrificing subject-specific tests for broader, semi-related measures.

For instance, the state suggests districts could eliminate end-of-year music tests and instead assess music teachers on how well their students — or students in the school as a whole — did on the state English language arts test.

When APPR was first shoved onto teachers, the UFT pushed back against criticism from some of its members over the testing component by noting that it had forced the state to break up the 40% "student performance" component into 20%-20%, with half based on state tests and the other half based on "local assessments" that would be created by the districts and subject to local negotiations.

But it seems the state is now moving to get as many districts as possible to use just the state tests to rate teachers - even when teachers don't actually teach subjects that have state tests.

According to the Democrat and Chronicle article, NYSUT opposes the move to rate teachers on state tests alone:

A spokesman for the New York State Union of Teachers said that while the union supports less testing in general, it opposes a greater reliance on state exams.

But you can see what's going to eventually happen here if we're not careful.

The state is moving to have 40% of a teacher's evaluation based upon state tests only.

The "local assessments" part of the "student performance" component was always cumbersome and hard to pull off.

Some districts, according to the D&C piece, did pull it off and are happy with the "local assessments."

Many did not.

Indeed, in Rochester, the union is pushing to get rid of many of the "local assessments" and just use the state tests for ratings:

Rochester early education teachers, though, protested and filed a union grievance in June over unnecessary end-of-year testing for APPR purposes. Several teachers told the school board in June that they hadn't taught an actual lesson for more than a month because their end-of-year assessment demands were so burdensome. They described leaving entire classes of kindergarteners unattended while taking students aside one-by-one and administering a series of evaluations.

The Rochester Teachers Association is currently in negotiations with the district to amend the terms of its APPR agreement and remove as many local exams as possible, according to Adam Urbanski, its president.

"Our goal is to eliminate as much non-mandated testing as possible. To the best as I can discern, RCSD has a similar goal," he said. "Not because we think state tests are superior to local tests, but if you don't have local exams that are a significant improvement over state tests, then why do double testing?"

Local exams in Rochester are not superior to state exams by much, he said, because teachers haven't had sufficient time and resources to create them.

SED looks to be dangling a little Race to the Top money in order to get districts to agree to use the state tests for 40% of a teacher's evaluation.

Which can be a huge problem if you're a teacher who works in a school where students have not scored highly on the state tests in the past.

That was supposedly why the 40% "student performance" component was divided into 20% state tests and 20% "local assessments" - so that teachers who work in schools where students have scored low on state tests in the past would not be unfairly dinged on the 40% "student performance" component.

This is especially important because the SED and Regents, pushed by Cuomo, made it such that if a teacher is rated "ineffective" on the 40% "student performance" component, that teacher has to be rated "ineffective" overall no matter how they scored on the 60% subjective component.

With the state pushing to get districts to just use the state tests to rate teachers on the 40% "student performance" component, you're going to see an increase in teachers getting rated "ineffective" overall.

Cynics out there - including me - have thought that APPR was always devised to rate as many teachers as possible "ineffective" so that districts could fire them if they wanted (two straight "ineffective" ratings and a teacher can be subject to firing.)

The first batch of APPR ratings didn't bring a high number of "ineffective" ratings across the state, but in certain districts - like Rochester and Buffalo - where state tests were used two different ways to rate teachers, there was a high percentage of teachers rated "ineffective."

Don't think that isn't one of the strategies behind the state's move to get districts to use state tests for the whole 40% of the "student performance" component.

NYSED says they want to "reduce" testing and defend themselves from parent and educator complaints about "overtesting" and maybe even save districts a little money by killing some of the "local assessments."

And maybe that's so.

But a side benefit to all of this is that NYSED Commissioner John King, Regents Chancellor Merryl Tisch and their merry men and women in reform in Albany will get what they wanted all along when it came to teacher evaluations - 40% of APPR based on just the state tests.

And it looks like at least some of the unions in the state - including the local in Rochester - are going to help them.

I get why people want to eliminate as many of the "local assessments" as possible - as I wrote above, they're cumbersome and they require a lot of time and effort.

But be careful of what the consequences of that assessment purge may be - 40% of a teacher's rating based solely on a state test (albeit with the numbers crunched two different ways).


  1. So, they want me to be rated on 100% of a test that 70% of the students are not passing?

    Sounds like a setup, which is not surprising. Since NCLB and Race to the Top, everything the reformers do is a setup for failure, profits, and budget-cutting.

    Accept this setup by NYSED or turn away and be punched in the face by your union leaders, then turned back around.

  2. Pogue, you've hit on the best crystallization of this:

    "So, they want me to be rated on 100% of a test that 70% of the students are not passing?"

    That really puts it in the right perspective.

  3. How about art, music, and PE teachers? Do you think they want to be rated on 100% of a test that 100% of their kids take for a subject that 100% of these teachers DON'T EVEN TEACH?

    1. In some ways, it's the best thing that can happen - it brings the absurdity level to a crescendo and exposes APPR as the sham it is.

      Unfortunately people will be hurt in the exposure process, however.

  4. They just want to sell more tests and more testing equipment.
    They want to make all the profit they can, while they can before this whole sham blows up all around them.

    1. I think SED and the Regents are scrambling to save APPR by trying to make it look like the Endless Testing regime is ending. Fewer tests might make parents happy, keep them from yelling at King. But of course it's not about the amount of testing but the stakes on the tests that cause so many of the problems. Making state tests count for 40% of a teacher's evaluation isn't going to help curtail test prep, that's for sure.

  5. As a NYC HS teacher teaching a Regents class, my 40% is already based on my students' Regents: state score on some made-up growth (20%), and then my local is based on some other factors (??) school-wide of the same Regents (20%). Since my students did better, I will be scored lower because of students I have never taught. I'm not even judged on all my classes/students, but the course I have more students in.

  6. Silly me, I thought the headline said NYSUT not NYSED. Hey wait a minute....