Are demographic changes behind test score improvements?

DC Mayor Vincent Gray announced record increases in test scores last month, attributing the gains to his education reform policies. But could demographic changes in DC be responsible for the increases? The answer is: we don't know.

Photo by verbeeldingskr8 on Flickr.

Mayor Gray and DCPS Chancellor Kaya Henderson have claimed that the increases validate their education reform policies and show we must "stay the course", a subtle jab at Councilmember David Catania's "reform 2.0" proposals.

But is it possible that the test score increases reflect the growth of middle and upper class families in DC, and not increased school quality?

What does census data say?

DCPS points to the improvements in test scores each year since the 2007 mayoral takeover of DC Public Schools.

DCPS test scores have consistently gone up

But look at the changes in demographics among DC families over a similar time period.

Income of DC Families. Source: Census American Community Surveys

The median income among families in DC has climbed steadily, from $51,411 in 2005 to $75,603 in 2011, according to the Census Bureau's annual American Community Survey (ACS). And students from higher-income families tend to do better on standardized tests.

Some important caveats apply. First, we don't know whether this demographic shift is reflected in the public school population. Second, the available ACS data only go through 2011; 2012 data is scheduled for release in September.

DCPS spokesperson Melissa Salmanowitz pointed me to the growth in scores since 2007 for students who receive free or reduced-price lunch, saying these gains had "disproven" the thesis that demographics are behind the overall score gains. Students eligible for free or reduced-price lunch come from families earning under 185% of the poverty line.

It's true that scores for those students have gone up. But most or all of that gain happened before many of the reforms that Gray and Henderson credit for the increase took effect. After 2009, scores for students who receive free or reduced-price lunch plateaued.

FARM scores increased before school reform

And this spike in scores from 2007 to 2009 wasn't due to cheating. Scores on the federal NAEP test among students receiving free or reduced-price lunch also spiked from 2007 to 2009, then leveled off or declined in 2011. The NAEP test is allegedly "uncheatable."

What does this mean?

Does this mean that demographics, and not school quality initiatives like Common Core, charter expansion, teacher assessments and extended school day, are responsible for the increase in test scores? I don't believe we can draw that conclusion.

What this means is that we don't know what is causing the test score gains, because we are relying on static proficiency measures that have been roundly criticized.

Static proficiency measures compare different cohorts of students from year to year. The problem is that when the demographic composition of students in a school changes, test results may go up or down because of that change rather than because of a change in the quality of instruction.
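To make that concrete, here is a toy calculation in Python with invented numbers (not actual DCPS data): each subgroup's pass rate stays flat, yet the overall static proficiency rate rises purely because the mix of students shifts.

```python
# Toy example with invented numbers -- not actual DCPS data.
# Each subgroup's proficiency rate is flat year over year; only the
# mix of students changes, yet the overall "static" rate rises.

def overall_rate(groups):
    """Overall proficiency across (student_count, pass_rate) subgroups."""
    total = sum(n for n, _ in groups)
    return sum(n * rate for n, rate in groups) / total

# Year 1: 800 lower-income students at 40% proficient, 200 higher-income at 80%.
year1 = overall_rate([(800, 0.40), (200, 0.80)])
# Year 2: identical subgroup rates, but the mix shifts to 600/400.
year2 = overall_rate([(600, 0.40), (400, 0.80)])

print(f"Year 1: {year1:.0%}  Year 2: {year2:.0%}")  # Year 1: 48%  Year 2: 56%
```

An 8-point "gain" appears even though, by construction, no group of students improved; a growth measure that followed the same students would show no change.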

DC Public Schools and Mayor Gray are currently held to a bar set for them by others, namely OSSE and the federal No Child Left Behind law, rather than one they set themselves. And they should be congratulated for moving these static test scores in the right direction.

What needs changing, as the National Academy of Sciences argues, is the bar itself. We need to start holding schools accountable using metrics of growth, such as how many grade levels a school advances its students each year. Isn't that what matters?

Chancellor Henderson's response to the suggestion that increased test scores might be the result of an influx of wealthier students was: "Haters are going to hate." But is it hate to try to follow the data wherever it leads? I don't think so. I thought that's what school reform was all about.

Ken Archer is CTO of a software firm in Tysons Corner. He commutes to Tysons by bus from his home in Georgetown, where he lives with his wife and son. Ken completed a Masters degree in Philosophy from The Catholic University of America. 



I don't think it's quite right to say that scores for Free and Reduced Meals students plateaued after 2009. Rather, like the scores for most other cohorts, they dropped, and later rose.

There are a number of ways to interpret this: one is that the Rhee/Henderson reforms led to a lot of upheaval that hit DCCAS results hard in the first couple of years, then began to gain their stride in the last two years.

This interpretation does not touch on whether the current model is the best approach, but regardless of whether or not it is, one could argue that stay-the-course is the best decision. Creating another new regime to lurch the system in a different direction would likely create another drop (followed by a later rise, but only if the new system is given enough time for implementation).

The discontinuity of the pre-Rhee era in DC showed its effects for a decade, as each new superintendent brought in their own plan, and few were able to stick around for more than a year or so to see it through. In fact, it is reasonable to argue that the improvements in 2008 and 2009 are due in large part to Rhee's immediate predecessor, Clifford Janey, being the only superintendent in a decade to be in place for more than 2 years.

by Jacques on Aug 21, 2013 11:46 am

Actually the scores didn't change much for several years and then bumped up this past year. No real statistical trend over time there. The recent change could be a trend; it also could be statistical noise. The problem here is only comparing 2007 and 2013 without considering what happened in between (nothing much).

by Rich on Aug 21, 2013 12:45 pm

I have a hunch that the "advanced" group will track gentrification. It would be really good to see the full grade, school, and subgroup breakdowns of advanced students, particularly in a time series from 2007 to date. I think there was a restriction in the released OSSE data that subgroup counts under 25 would not be reported. Advanced students seem always to number less than 25, except perhaps across entire schools, so the data would not be publicly reported.

Is there any way to get the "under 25" data so that it can be charted and studied? If OSSE or DCPS won't release it, would they perform analyses and release charts or summaries that reflect percentages or rates of growth, etc., without revealing the underlying numbers?

If you could show that numbers of advanced students grew without demographic change, I would be impressed. But my hunch is that it is not the case.

by andy on Aug 22, 2013 11:29 am

This year's test is aligned with Common Core and thus is a "new test". A new test needs to be baselined to the old test and new benchmarks for proficient / advanced, etc. need to be established.

It's easy to fudge the results and show "improvement" with minor adjustments in the baseline process. Small tweaks when setting the baseline to the last year's test can create SUCCESS or FAILURE headlines (without students having learned any more or less).
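As an illustration of that point, here is a small Python sketch with made-up scores: the same students, with the same answers, read as more or less "proficient" depending only on where the cut score is set.

```python
# Made-up raw scores for ten students -- purely illustrative.
scores = [31, 38, 42, 47, 50, 55, 58, 61, 66, 73]

def pct_proficient(scores, cut):
    """Share of students at or above the proficiency cut score."""
    return sum(s >= cut for s in scores) / len(scores)

# The same students and the same answers, under two candidate cut scores:
print(pct_proficient(scores, 50))  # 0.6 -> a "60% proficient" headline
print(pct_proficient(scores, 55))  # 0.5 -> a "50% proficient" headline
```

A five-point move in the cut score swings the headline by ten percentage points without any student having learned more or less.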

A new test "improvement" is in line with scores going up broadly across the district. There doesn't even need to be a claim of "gaming the test" or "bad faith". However, DCPS and OSSE are united in wanting to see results from all the changes they've put in place and "racing to the top". Two years out things should stabilize and the test results should start to be accurate to the actual learning trend.

BTW, NY state also introduced new common core aligned tests and their students "failed miserably" when the new benchmarks on the norm-based test were set at unrealistically high levels to be 'proficient'. Note that the NY test is not a CRITERION based test, which tests specific things the student should have learned, but instead is like the SAT, which identifies how you compare to OTHERS, not how you compare to a body of knowledge. Advanced when compared to others means you know stuff that wasn't even in the curriculum for the grade and wasn't taught. Proficient in a criterion-referenced test means you know some percentage of what is expected (mandated) to be learned by the end of the grade.
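To illustrate the distinction with hypothetical numbers (nothing here reflects the actual NY or DC tests): a norm-referenced report tells you where a score falls among other test-takers, while a criterion-referenced report tells you what fraction of the required material was mastered.

```python
from bisect import bisect_left

# Hypothetical cohort and score -- not real NY or DC data.
cohort = [35, 40, 45, 52, 55, 60, 62, 68, 70, 75]

def percentile_rank(score, cohort):
    """Norm-referenced: fraction of the cohort scoring below you."""
    return bisect_left(sorted(cohort), score) / len(cohort)

def criterion_score(items_correct, items_required):
    """Criterion-referenced: fraction of the mandated material mastered."""
    return items_correct / items_required

print(percentile_rank(55, cohort))   # 0.4 -> only 40% of peers scored lower
print(criterion_score(55, 80))       # 0.6875 -> ~69% of required content
```

The same raw score of 55 looks mediocre relative to peers but respectable against a fixed body of required knowledge, which is why moving between the two reporting styles can flip a headline.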

Some scores in some schools are probably based on demographic changes (Maury perhaps), but it would be hard to tease this out of the data, except by viewing the White enrollment over time. One outlier in the performance for the White subgroup is Murch 3rd grade where proficient and advanced was lowest in city for white subgroup (at 77%).

by graduated. on Aug 26, 2013 3:07 pm

@graduated. -- I've been wondering why the "Common Core aligned" test scores went up in DC and dramatically down in NY and also in KY. Specifically, I've been wondering whether that means students in DC are really better prepared for the advent of the PARCC tests in a couple of years. If I understand you correctly, you're saying it has more to do with where different states set the bar for the various categories (Proficient, etc.) and whether they were testing knowledge in absolute terms or relative to other students. In other words, we still really don't know how DC students match up against students in other jurisdictions.

by Natalie on Aug 26, 2013 3:30 pm

@natalie, -- The only way to really see how different populations score on high stakes tests is to give them the same test. The NAEP is an accurate test of capability that is given nationwide at 4th and 8th grades to a sample of all students. The scores are broken out by racial classification (when sufficient numbers are in the tested group).

However, for subgroups (e.g. White / 8th grade) the NAEP numbers have not been reported for DC because there have been too few students in the tested grade subgroup.

How this ties to "doing in school" and learning the necessary skills to advance to the next level... well, that's a tough and a different question. Valerie Strauss writes about the horrific issues with NY State's incredibly bad Common Core test "results".
The NY story is that the new baseline failed everybody across the board, and did it so horribly and publicly that there has been backlash. The superintendent has effectively promised to reset the benchmark and guarantees that scores will go up NEXT YEAR.

How can you tell how DC schools are doing? Ignore the test scores until there have been three years without changes.

If you have a child in the system (and you are attentive enough to be reading GGW) you will know if your child is learning (and if your teacher is teaching). You will choose a school with a sufficiently high SES/White population to be confident that there is a built-in advocacy group for quality instruction at the school. You will choose a school by its demographics and then advocate for your kid and others at the school.

The problem is that there are not enough schools in DC with the demographic mix that leads to high(er)-performing schools, and establishing them takes 1) a long time and 2) commitment. There is a first-mover problem; if a large enough cohort of students with high-SES, high-achieving demographic characteristics enrolls at the same time, you may hit the tipping point to create a "good" school, but if you don't hit the tipping point your child will suffer.

by graduated. on Aug 27, 2013 2:33 pm

@natalie -- Also NAEP has an "urban schools" group, which is the appropriate measure for how DC schools are doing against similar school populations. Comparing DC schools against "state" scores is ridiculous and won't help. DC doesn't have *any* schools with "suburban" demographics.

by graduated. on Aug 27, 2013 2:39 pm

DC public schools are still hemorrhaging black boys into the criminal justice system in numbers that dwarf those of white students. One would expect black officials not to ignore this waste of talent and tax resources.

by martin on Jul 31, 2014 9:47 am
