Friday, March 13, 2015

PA: All About the Tests (And Poverty)

In Pennsylvania, we rate schools with the School Performance Profile (SPP). Now a new research report reveals that the SPP is pretty much just a means of converting test scores into a school rating. This has huge implications for all teachers in PA because our teacher evaluations include the SPP for the school at which we teach.

Research for Action, a Philly-based education research group, just released its new brief, "Pennsylvania's School Performance Profile: Not the Sum of Its Parts." The short version of its findings is pretty stark and not very encouraging--

90% of the SPP is directly based on test results.

90%.

SPP is our answer to the USED waiver requirement for a test-based school-level student achievement report. It replaces the old Adequate Yearly Progress of NCLB days by supposedly considering student growth instead of simple raw scores. It rates schools on a scale of 0-100, with 70 or above considered "passing." In addition to being used to rate schools and teachers, SPPs get trotted out any time someone wants to make a political argument about failing schools.

RFA was particularly interested in looking at the degree to which SPP actually reflects poverty level, and their introduction includes this sentence:

Studies both in the United States and internationally have established a consistent, negative link between poverty and student outcomes on standardized tests, and found that this relationship has become stronger in recent years.

Emphasis mine. But let's move on.

SPP is put together from a variety of calculations performed on test scores. Five of the six components-- which account for 90% of the score-- "rely entirely on test scores."

Our analysis finds that this reliance on test scores, despite the partial use of growth measures, results in a school rating system that favors more advantaged schools.

Emphasis theirs.
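For the sake of concreteness, here's a minimal sketch of how a weighted composite like the SPP gets put together. The component names, weights, and school numbers below are illustrative stand-ins for the brief's 40/40/10/10 description, not the state's actual formula.

```python
# Illustrative only: a toy weighted composite in the spirit of the SPP.
# Weights follow the brief's 40/40/10/10 description; the state's real
# calculation has more moving parts.
SPP_WEIGHTS = {
    "achievement": 0.40,       # proficiency on state tests (test-based)
    "growth_pvaas": 0.40,      # PVAAS growth estimates (test-based)
    "gap_closing": 0.10,       # closing the achievement gap (test-based)
    "other_indicators": 0.10,  # graduation, attendance, AP/PSAT participation
}

def composite_spp(component_scores):
    """Weighted sum of component scores, each on a 0-100 scale."""
    return sum(SPP_WEIGHTS[name] * score
               for name, score in component_scores.items())

# A hypothetical school that shines on the non-test indicators but not on tests.
school = {"achievement": 55, "growth_pvaas": 60,
          "gap_closing": 50, "other_indicators": 95}
print(composite_spp(school))  # 60.5 -- still below the 70 "passing" line,
                              # because 90% of the weight rides on test scores
```

The point of the toy example is the weighting itself: no matter what a school does on the 10% that isn't test-derived, the test-based 90% decides the rating.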


The brief opens with a consideration of the correlation of SPP to poverty. I suggest you go look at the graph for yourself, but I will tell you that you don't need any statistics background at all to see the clear correlation between poverty and a lower SPP.  And as we break down the elements of the SPP, it's easy to see why the correlation is there.
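If you do want the statistics, the check RFA is describing is about as simple as it gets: a correlation between each school's poverty level and its SPP. Here's a sketch with made-up numbers (the real analysis uses the state's published school-level data).

```python
# Hypothetical data: percent economically disadvantaged vs. SPP score for a
# handful of made-up schools. Requires Python 3.10+ for statistics.correlation.
import statistics

pct_poverty = [10, 25, 40, 55, 70, 85]
spp_score = [88, 81, 74, 66, 58, 49]

r = statistics.correlation(pct_poverty, spp_score)
print(round(r, 2))  # strongly negative: as poverty rises, the SPP falls
```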

Indicators of Academic Achievement (40%)

Forty percent of the school's SPP comes from a proficiency rating (aka just plain straight-up test results) drawn from tested subjects, third grade reading, and the SAT/ACT College Ready Benchmark. Whether we're talking third grade reading or high school Keystone exams, "performance declines as poverty increases."*

Out of 2,200 schools sampled, 187 had proficiency ratings higher than 90, and only seven of those had more than 50% economically disadvantaged enrollment. Five of those were Philly magnet schools.
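Under the hood, that proficiency component is basically a percent-above-the-cut-score count. The scores and cut score below are hypothetical; the real cut scores come from the state.

```python
# Illustrative only: percent of test-takers at or above a proficiency cut score.
def percent_proficient(scale_scores, cut_score):
    return 100 * sum(score >= cut_score for score in scale_scores) / len(scale_scores)

# Hypothetical scale scores and a hypothetical cut score.
print(percent_proficient([1420, 1510, 1380, 1600, 1450], cut_score=1440))  # 60.0
```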

Indicators of Academic Growth aka PVAAS (40%)

PVAAS is our version of a VAM rating, in which we compare actual student performance to the performance of imaginary students in an alternate neutral universe run through a magical formula that corrects for everything in the world except teacher influence. It is junk science.
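PVAAS itself is a proprietary model, so nothing here is the actual formula; this is just the bare-bones shape of any value-added measure, with made-up numbers: predict this year's scores from prior scores using a reference sample, then treat whatever the model didn't predict as the school's "effect."

```python
# Toy value-added sketch (NOT PVAAS): predict current scores from prior scores
# using a statewide reference sample, then attribute one school's residuals
# to the school. Requires Python 3.10+ for statistics.linear_regression.
import statistics

# Hypothetical statewide reference sample: prior-year vs. current-year scores.
state_prior = [1200, 1250, 1300, 1350, 1400, 1450, 1500, 1550]
state_current = [1230, 1270, 1330, 1370, 1430, 1470, 1520, 1580]
slope, intercept = statistics.linear_regression(state_prior, state_current)

# One hypothetical school's students.
school_prior = [1220, 1340, 1410]
school_current = [1260, 1350, 1455]

predicted = [slope * x + intercept for x in school_prior]
residuals = [actual - guess for actual, guess in zip(school_current, predicted)]

# The "value added" is the average residual -- everything the simple model
# didn't predict, which gets credited (or charged) to the school.
print(round(statistics.mean(residuals), 1))
```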

RFA found that the correlation with poverty was still there for the PSSAs (our elementary tests), but not quite as strong as the proficiency correlation. For the Keystones, writing, and science tests, however, the correlation with poverty is, well, robust. Strong. Undeniable. Among other things, this means you can blunt the impact of Keystone results by getting some PSSA test-takers under the same roof, since their less poverty-driven growth numbers get averaged into the school's score. Time to start that 5-9 middle school!!

Closing the Achievement Gap (10%)

This particular measure has a built-in penalty for low-achieving schools (aka high-poverty schools-- see above). Basically, you've got six years to close half the proficiency gap between where you are and 100%. If you have 50% proficiency, you've got six years to hit 75%. If you have 60%, you have six years to hit 80%. The lower you start, the more students you must drag over the test score finish line in the same six years.
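The arithmetic of that penalty is easy to write down. Taking the brief's description at face value (close half the distance to 100% in six years), the required gain per year grows as the starting point drops -- the numbers below are just worked examples.

```python
# Gap-closing target as described: close half the distance to 100% in six years.
def six_year_target(current_proficiency):
    return current_proficiency + (100 - current_proficiency) / 2

for start in (90, 70, 50, 30):
    target = six_year_target(start)
    print(f"start {start}% -> target {target}% "
          f"({(target - start) / 6:.1f} points per year)")
# start 90% -> target 95.0% (0.8 points per year)
# start 70% -> target 85.0% (2.5 points per year)
# start 50% -> target 75.0% (4.2 points per year)
# start 30% -> target 65.0% (5.8 points per year)
```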

The remaining 10%, incidentally, is items like graduation rate and attendance rate. Pennsylvania also gives you points for the number of students you can convince to buy the products and services of the College Board, including AP stuff and the PSAT. So kudos to the College Board people on superior product placement. Remember, kids-- give your money to the College Board. It's the law!

Bottom line-- we have schools in PA being judged directly on test performance, and we have data once again clearly showing that the state could save a ton of money by simply issuing school ratings based on the income level of students.

For those who want to complain, "How dare you say those poor kids can't achieve," I'll add this. We aren't measuring whether poor kids can achieve, learn, accomplish great things, or grow up to be exemplary adults-- there is no disputing that they can do all those things. But we aren't measuring that. We are measuring how well they do on a crappy standardized test, and the fact that poverty correlates with results on that crappy test should be a screaming red siren that the crappy test is not measuring what people claim it measures.

*Correction: I had originally included a mistyping here that reversed the meaning of the study.

2 comments:

  1. Same in NC. But we had 80% test score and 20% growth to find out that SES/poverty = School Grade. http://www.ncpublicschools.org/docs/accountability/reporting/spgexecsumm15.pdf Figure 7 says the same thing.

  2. We had an in-service on poverty today using Eric Jensen's book..Teaching with Poverty in Mind... and found out we as teachers can fix all those nasty poverty effects on kids with better teaching strategies. One more thing we are responsible for, eradication of poverty.....also getting every single kid to be prepared for college so they can all get high paying jobs. I know 2 local college grads working at McDonalds....where are those jobs?
