A South Side English teacher, Chris Webster, speaks out about the changes in the scoring of the English Regents. He believes students are being used as pawns in the reform agenda. Below is his letter to John King.
Dear Commissioner King:
You often tell the story of a teacher who had a positive influence on your life. We all remember a teacher who acknowledged who we are; one who valued our talents and dreams. That is why I became a teacher. I wanted to have an influence on the next generation; I wanted to help mold and guide young adults and show them their value. And, of course, I wanted to express and share my love of literature. That is why I have spent, and enjoyed, the last 16 years as an English Language Arts teacher. And so it is with a heavy heart that I watch the New York State Education Department blatantly manipulate data for its own agenda, the true victims being the children it purports to represent.
I cannot, and will not, speak to the politics and what goes on behind the scenes. I am, however, right there on the front line, not in educational theory but in the classroom. My students completed the New York State Comprehensive Exam in English (the “Regents Exam”) this past Tuesday, and I am shocked and appalled by what the State is doing. I am not a statistician, but I can speak in real terms about what I have noticed.
On the English Regents Exam, a student can score a maximum of 25 points on the multiple choice questions, and a maximum of 10 points on the three written components. These “raw scores” are then converted into a score out of 100. To use the score conversion chart, one finds the multiple choice score along the y-axis and the essay score along the x-axis; the box where the two intersect contains the converted score on a 100-point scale. With 25 points along the y-axis and 10 points along the x-axis, the chart contains a total of 250 “boxes,” each with a score out of 100.
The June 2013 scores introduced a disturbing change. In the last three administrations of this exam, June 2012, August 2012, and January 2013, a passing score of 65 or higher was available in 70 of the boxes. This represented 28% of the possible scores. In the most recent administration, June 2013, the 70 boxes with passing scores were reduced to 57. This represents 23% of the possible scores. Looked at this way, the State is using the same chart with the same raw scores but has reduced the passing possibilities by five percentage points. Another way of looking at the same data is the count of passing boxes itself: the State went from 70 boxes of passing scores to 57, a reduction of almost 20%.
The same pattern holds when one looks at Mastery (a score of 85 or higher) rates on the exam. Again, using the June 2012, August 2012, and January 2013 administrations, Mastery scores were available in 15 of the boxes, representing 6% of the possible scores. In the most recent administration, June 2013, the 15 boxes with Mastery scores were reduced to 12. This represents approximately 5% of the possible scores, a reduction of one percentage point. But look at the same data in another light: the State went from 15 boxes of Mastery scores to 12. That is a reduction of 20%.
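The arithmetic in the two paragraphs above can be checked with a few lines of Python. The box counts are the ones quoted in the letter, and the 250-cell chart size follows from the letter's own 25-by-10 description of the conversion chart:

```python
# Box counts quoted in the letter; chart size per its 25 x 10 description.
TOTAL_BOXES = 25 * 10  # 250 cells in the conversion chart

def share(boxes):
    """Fraction of all chart cells that yield the given outcome."""
    return boxes / TOTAL_BOXES

def relative_drop(before, after):
    """Relative reduction in the number of qualifying cells."""
    return (before - after) / before

# Passing (65 or higher): 70 cells previously, 57 in June 2013
print(f"passing share: {share(70):.0%} -> {share(57):.0%}")        # 28% -> 23%
print(f"relative reduction: {relative_drop(70, 57):.0%}")          # 19%

# Mastery (85 or higher): 15 cells previously, 12 in June 2013
print(f"mastery share: {share(15):.0%} -> {share(12):.0%}")        # 6% -> 5%
print(f"relative reduction: {relative_drop(15, 12):.0%}")          # 20%
```

Both framings in the letter come out of the same numbers: the share of the chart drops by a few percentage points, while the count of qualifying boxes drops by roughly a fifth.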
There are yet other concerns. Across the board, with all scores, the scores in the corresponding boxes have been reduced. For example, in the June 2012 administration, a student who scored a 17 on the multiple choice and a 7 on the essays earned a grade of 66. In August 2012 the score was 67. In January 2013 the score was 67. In June 2013 the score was 63, a failing grade, despite the raw scores being exactly the same as the 3 previous administrations. A second example: using the last 3 administrations of the test once again, if a student’s raw score was 24 on the multiple choice and an 8 on the written responses, the grade earned was an 85 (considered Mastery level). However, in the June 2013 administration, these same raw scores converted to an 83, a drop of two points, and, more importantly, failure to achieve Mastery.
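The two comparisons above can be laid out as a tiny lookup table. This is a hypothetical fragment containing only the cells cited in the letter; NYSED's actual conversion charts cover all 250 cells:

```python
# Hypothetical fragment of the conversion charts: only the two
# (multiple choice, essay) cells cited in the letter are included.
conversions = {
    "June 2012":    {(17, 7): 66, (24, 8): 85},
    "August 2012":  {(17, 7): 67, (24, 8): 85},
    "January 2013": {(17, 7): 67, (24, 8): 85},
    "June 2013":    {(17, 7): 63, (24, 8): 83},
}
PASSING, MASTERY = 65, 85

# The same raw scores pass (or reach Mastery) in three administrations
# and fall short only in June 2013.
for admin, chart in conversions.items():
    low, high = chart[(17, 7)], chart[(24, 8)]
    print(f"{admin}: 17/7 -> {low} ({'pass' if low >= PASSING else 'fail'}), "
          f"24/8 -> {high} ({'Mastery' if high >= MASTERY else 'no Mastery'})")
```

Identical raw performance, different outcomes: the only thing that changed between administrations is the chart itself.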
It is only a matter of time before we see the newspaper headlines saying “Regents Scores Drop Across the State.” The test has NOT become more difficult or easier; it is similar to recent exams. The State has simply made it more difficult to pass. In my opinion, this feels like one more attempt to prove that public education is not working.
In the effort to push your reform agenda, students have become the victims. New York State’s high school students deserve better. If an 11th grade student took the exam last year, statistically speaking, he or she had a 20% higher chance of meeting with success. The “high stakes” testing agenda is shameful.
Christopher A. Webster
South Side High School
Rockville Centre, NY
The ELA Regents exam scoring chart was a political bludgeon this year devised to fail many more students than before, thus "proving" their teachers are "failing" and giving the state and districts the opportunity to rate thousands of them "ineffective" based solely on this bogus grading chart.
This is an outrage and every politician in the state should hear from their constituents about this.
King is a smug, arrogant man, and he thinks he can pull his reformy act with impunity, rigging the APPR system and the Regents scoring charts without anybody calling him on his misdeeds.
But when politicians hear from outraged students, parents, teachers and administrators about this, there is at least a chance that NYSED Commissioner/rookie teacher John King and his NYSED functionaries can be held accountable for their rigging of the system.