Greater Greater Education

Test scores are not improving for at-risk student groups

DC Public Schools Chancellor Kaya Henderson announced the system's 2014 test scores yesterday, saying "we're continuing on an upward trajectory." However, a closer look at the scores reveals a stagnant or downward trajectory for black, Hispanic, low-income, English language learner, and special education students in the last five years.


Reading scores have declined among at-risk groups since 2009. Graph from DCPS with emphasis by the author.

It's true that reading test scores overall have increased since 2009, and slightly since last year. However, it's a different story for many demographic subgroups, including every at-risk subgroup: students receiving free or reduced-price lunch (FARMS), black students, Latino students, special education students, and students whose first language is not English (called "English Language Learners"). For those students, scores have declined since 2009, and further since last year.

Math scores are mixed among at-risk subgroups since 2009

While reading scores have declined since 2009 among all at-risk subgroups, math scores look better.

Black and Hispanic students have gained on average since 2009, though white students have gained even more. Lower-income (FARMS) students and special education students gained slightly, while English language learners lost considerable ground.

The achievement gap is widening

The decline among at-risk subgroups, along with gains among white and Asian students, has widened the achievement gap in DC. The National Assessment of Educational Progress (NAEP), a federal test administered every other year, reports the gap between students eligible and not eligible for free and reduced-price lunch.


2013 Department of Education report of 8th grade NAEP test scores with emphasis by the author.

However, this gap appears nowhere in the 2014 DC CAS score reports from the Office of the State Superintendent of Education (OSSE) or from DCPS. The Department of Education said the following about this achievement gap in its most recent report on DC NAEP scores.

In 2013, students who were eligible for free/reduced-price school lunch, an indicator of low family income, had an average score that was 31 points lower than students who were not eligible for free/reduced-price school lunch. This performance gap was not significantly different from that in 1998 (25 points).

What does this mean for reform policies?

Can we draw any conclusions about DCPS's reform efforts from this data?

Scores did increase substantially in reading as well as math from 2007 to 2009, and are still above 2007 levels in all categories. DC Public Schools (DCPS) officials argue that 2007 should be the baseline (and therefore we should consider their reforms a success) because mayoral control of DCPS began in 2007.

However, the IMPACT teacher evaluation system went into effect in 2009. The first round of DCPS school closures was announced in the spring of 2008, and implemented over the next two years, well after students had taken the 2008 CAS test.

Most students taking the CAS tests in the spring of 2007, 2008 or 2009 were still unaffected by the IMPACT system or by school closures.

On the other hand, it may still be too early to judge the effects of any particular reform. Still, we must ask, how long will it take to know for sure?

Is DCPS really "on an upward trajectory"? If DC's education system is slowly improving, but not for the groups for whom public education is most likely to make or break success in life, it is not doing its job.

Ken Archer is CTO of a software firm in Tysons Corner. He commutes to Tysons by bus from his home in Georgetown, where he lives with his wife and son. Ken completed a Masters degree in Philosophy from The Catholic University of America. 

Comments

The groups that are well-situated to learn will learn to perform in whatever system they are given. Introduce a metric, and they will learn how to deal with it.

by BenK on Aug 1, 2014 11:46 am

The endless cycle of educational reform:

1) It's the teachers!
2) It's the parents/adults!
3) It's the students!
4) It's the system!
5) go back to the teachers

When was the last time DCPS was able to teach people to read? 1923? 1913?

by charlie on Aug 1, 2014 12:31 pm

Downward trajectory over the last five years (except for the math), but an upward trajectory over the last four years! Or the last seven years! It's almost as if you can draw whatever conclusion you want just by choosing the right base year!

by alexandrian on Aug 1, 2014 12:34 pm

+1 alexandrian - statistics is such a fun game.

by JDC on Aug 1, 2014 1:09 pm

It is always the parents. Other kids are learning at the school, so that means teaching is taking place. While some kids are just not scholarly material, I believe that the most significant factor, by far, is parental influence/support/attitude. If learning is a priority at home, the kid will learn at school.

by The Truth™ on Aug 1, 2014 1:12 pm

As children with motivated parents continue to flee DC public schools for charter schools, I imagine DC public schools are just going to be left with kids whose parents are less and less motivated and involved. I am surprised all the numbers didn't fall.

by Richard B on Aug 1, 2014 1:29 pm

What's the grade level for the reading and math scores? I don't think we'll see real changes in the reading scores until the kids who graduate from the pre-K program get to testing age. The vocabulary gap for kids under age 3 in wealthy/low-income households is a huge factor in later academic performance. The pre-K program is designed to address the vocabulary gap.

by EvilStevie on Aug 1, 2014 2:45 pm

I think it's actually a fair point that 2010 might be the better comparison year. Almost any major school reform causes a drop in test scores and other outcomes in the first year, and in this case, almost all of the decline in reading scores (and where appropriate, math scores) was in the first year.

One thing that sets this apart from many other ed reform efforts is the fact that it's had 5 years for implementation. Often, new reform efforts only get a year or two due to political and superintendent turnover.

That said, it's fair to question why there haven't been larger increases in the scores of the at-risk subgroups. But using the 5-year timeframe is a bit like saying that GDP/employment has dropped since Obama came to office, even though the drop was all in the first year and we've seen 4-5 years of growth since then.

by Jacques on Aug 1, 2014 2:55 pm

Percent Proficient values are NOT Test Scores and therefore not measures of academic achievement across DCPS or charter LEAs.

The metric "Percent Proficient" favored by the US Dept of Education distorts the whole discussion of academic achievement. It hides the real gains that students are making in DCPS and the various charter LEAs - the metric is just a line in the sand.

Not all academic gains are captured by Percent Proficient. For example, when students move from a "Below Basic" level of proficiency to a high score within "Basic," they have achieved a significant academic gain - but it doesn't move the needle on Percent Proficient. At the other end of the scale, when students move from "Proficient" to "Advanced," the Percent Proficient remains unchanged. In fact, if at some point all students were Proficient, the metric would be meaningless, because it wouldn't measure what students were achieving beyond Proficient.
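This threshold effect can be sketched in a few lines of Python. The cut scores and student scores below are invented for illustration; they are not the actual DC CAS bands.

```python
# Sketch of how "percent proficient" can hide real gains.
# CUTS uses a made-up 0-100 scale, not the actual DC CAS scale.
CUTS = {"Below Basic": 0, "Basic": 40, "Proficient": 60, "Advanced": 80}

def pct_proficient(scores):
    """Share of students at or above the (hypothetical) Proficient cut."""
    return sum(s >= CUTS["Proficient"] for s in scores) / len(scores)

before = [20, 35, 45, 65, 85]   # raw scores for five students
after  = [38, 55, 59, 79, 99]   # every student gains 14-20 points

print(pct_proficient(before))   # 0.4
print(pct_proficient(after))    # 0.4 -- large gains, metric unchanged
```

Every student gains substantially, yet the Percent Proficient is identical, because no one crosses the Proficient cut.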

Students who have achieved real gains in test scores are, in effect, hidden by the use of the proficiency metric. And by inference, their schools receive no credit for those gains.

It's time for OSSE to develop true measures of individual student gains and aggregate them to the school and DCPS levels. The data are there -- the individual gains data are in fact used in the DCPS IMPACT system to rate teachers.

Why can't OSSE and DCPS provide more representative metrics for academic gains being achieved in our schools? We need to know which schools are 'highly effective'.

by Steve on Aug 1, 2014 3:20 pm

The low numbers for the groups mentioned in the article are unfortunate but not really surprising. The numbers for whites are very encouraging though. They show that children can be educated in DC schools. I think many white couples are under the impression that while DC is a great place to live, DCPS is not a good place to send their children. There is a lot of self-segregation that goes on as a result. In Ward 4, white students make up an absurdly low percentage of the students at most schools. At Coolidge HS it's 0% and at Roosevelt it's 1%. Parents need to know that white children actually do very well in DC schools. As far as test results are concerned, there is no reason to think that it's necessary to send them to private schools or move to the suburbs.

by Anon20011 on Aug 1, 2014 5:08 pm

It would be helpful to know if charters also had the same trajectory. My scan of the OSSE reports (just a first impression) is that they were also hitting a wall.

We may have to consider that curriculum is an issue here, and that will also take 3-5 years of data. Also, census measurements over the last 5 years show that poverty has concentrated and worsened even in cities like DC that did fairly well during the Great Recession. We cannot ignore that impact.

by DC Parent on Aug 2, 2014 9:01 am

Downward trajectory over the last five years (except for the math), but an upward trajectory over the last four years! Or the last seven years! It's almost as if you can draw whatever conclusion you want just by choosing the right base year!

One of my favorite quotes, variously attributed to Vin Scully, Andrew Lang, and Hans Kuhn:

"Americans use statistics like a drunk uses a lamppost - for support rather than illumination."

by dcd on Aug 3, 2014 9:49 am

Steve and Alexandrian are on to something. In fact, I'll go further and say that if your goal were to create statistics that allowed you to produce lots of numbers while obscuring the underlying reality, the scoring system used by DC would be a good model. There are well-established statistical measures for describing a population. The methodology DC uses, dividing the population into four buckets and reporting the percentage in each bucket, is not a normal statistical measure. More important, it is not a useful statistical measure. Not only does it offer nothing that traditional measures like the average and standard deviation provide, it also fails at the basic task those measures were created for: determining whether there are meaningful differences between two populations.

I would argue that it was in fact created as a measure to obscure differences between populations. Except in the most extreme cases, it's difficult to tell if one set of DC-CAS results is meaningfully better or worse than another. This is true whether you're comparing two schools or comparing a school or the system from year to year.

There is a practical impact from the choice of measurement methodology. The single-minded focus on percentage of kids scoring proficient or above -- and the high stakes for the careers of teachers and administrators -- means that teachers are incentivized to put the most effort into kids who are on the cusp of proficient. There's no reason to put effort into kids who are already proficient or higher, nor kids who are so far behind that bringing them up to proficient isn't a realistic goal.

by contrarian on Aug 3, 2014 11:56 pm

It's easy to look at the numbers, hard to offer answers.

by Math is easy on Aug 4, 2014 10:11 am

Just to add to the statistics conversation: a lot of information that is normally included in statistical reporting is missing here. The big one is that we have no reporting of the number of kids in each group. Based on the results, for the overall number to increase slightly, the number of kids who have reached proficient or above must be larger than it was in the past. That means there must be more white and Asian kids in the population overall. But is it that the number of white and Asian kids increased, or that the number of African-American and Latino kids decreased? That is its own story.

The whole issue of the proficiency cut scores has been debated, but every time you make changes to the test, you need some information that helps you decide whether the tests are equatable and whether the cut scores were set through equating. Otherwise, you cannot tell whether the number or percentage of students in each bin changed because scores changed or just because the goalposts moved. See New York state for the cut-score fiasco, where they lowered the cut so much that lots of kids were "proficient" but didn't really know any more than they did when they were not.

The other problem is that there is always measurement error, so not knowing whether the students are clustering around the cut scores or are more evenly distributed hides the true picture. Anyone who is just passing might be within the margin of error and should be failing. If a large proportion of the "proficient" are just at or above the cut score, I'm a lot more worried than not. On the other hand, if a large proportion of the not-proficient are just under the cut score and within the standard error, maybe they are casualties of measurement error and should be considered proficient.
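A hypothetical margin-of-error check makes this concrete. The cut score and standard error of measurement (SEM) below are invented for illustration, not the actual DC CAS values.

```python
# Hypothetical illustration: scores within one standard error of
# measurement (SEM) of the proficiency cut cannot be confidently
# classified either way. CUT and SEM are made-up numbers.
CUT, SEM = 60, 4

def classify(score):
    """Label a score, flagging results inside the error band."""
    if abs(score - CUT) < SEM:
        return "too close to call"
    return "proficient" if score >= CUT else "not proficient"

for s in (50, 58, 61, 70):
    print(s, classify(s))
# 50 not proficient
# 58 too close to call
# 61 too close to call
# 70 proficient
```

A student scoring 61 and a student scoring 58 may be statistically indistinguishable, yet the binary metric counts one as a success and the other as a failure.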

All said, this stuff is much harder than most people think and everyone should take the data and its reporting with a boulder of salt.

by Karen on Aug 4, 2014 3:25 pm

Using test scores to rate school progress is tricky, and the scores should be examined closely for bias. And tests can just be badly written, too. I'm reminded of how magnet schools got accused of not significantly improving their students' standardized test scores.
What the article didn't say was that all the students in the magnet program were in the 90th-99th percentile on the standardized tests before they were accepted.
That was a while ago, so I can't pull up the article today as an example, but I agree with Karen when she says: take the data and its reporting with a boulder of salt.

by asffa on Aug 4, 2014 3:48 pm
