Greater Greater Education

Do DC's standardized writing scores mean anything?

There's been a lot of talk about the most recent DC CAS reading and math scores and what they mean. But another set of test scores, assessing students' writing skills, hasn't gotten much attention. What do they mean, if anything?



The DC CAS has included a composition section since 2011, but the 2012-13 school year is the first time the scores have been factored into a school's rating for purposes of federal law. (The scores still don't count for DCPS teacher evaluations.) Like the reading and math scores, writing scores went up this year. But measuring writing proficiency is even trickier than assessing skills in reading and math.

All DCPS and DC charter school students in grades 4, 7, and 10 took DC's standardized writing test last spring. The overall proficiency rate for DCPS students was just under 50%. For charter school students the rate was slightly higher, just over 52%. In both cases that represents an increase over prior years.

Although the one-year increases were fairly modest, the gains from 2011 were dramatic for both sectors: DCPS writing scores increased by 16.9 points, and charter scores by 19.5 points.

Does that mean that DC students are now aces at writing? Based on what I know personally and what I've heard from teachers and parents, I doubt it. And the data show that more DCPS students fall into the "below basic" category in writing (20%) than in either reading (17%) or math (18%).

On the other hand, there are also more students in the "advanced" category: 21%, as compared to only 11% in reading and 16% in math. So the writing results are more polarized than those in the other subjects: there are more really good writers, as measured by the test, but also more really bad ones.

In fact, fewer DCPS students scored at the "proficient" level in writing than in reading or math: only 29%, as compared to 36% in reading and 33% in math. Because the proficiency rate combines "proficient" and "advanced," it's really the relatively large number of students in the advanced category that pulls the writing proficiency rate up to nearly 50%.
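To make the arithmetic concrete, here's a minimal sketch in Python, using the rounded DCPS percentages cited above, of how the advanced category props up the overall rate:

```python
# Approximate DCPS 2013 distributions cited above (percent of students).
distribution = {
    "writing": {"proficient": 29, "advanced": 21},
    "reading": {"proficient": 36, "advanced": 11},
    "math":    {"proficient": 33, "advanced": 16},
}

for subject, pcts in distribution.items():
    # The reported proficiency rate combines "proficient" and "advanced."
    rate = pcts["proficient"] + pcts["advanced"]
    print(f"{subject}: {rate}% proficient or above "
          f"({pcts['advanced']} points of that from the advanced category)")

# writing: 50% proficient or above (21 points of that from the advanced category)
# reading: 47% proficient or above (11 points ...)
# math:    49% proficient or above (16 points ...)
```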

Breakdown of DCPS scores

DCPS has released a breakdown of scores by ward, subgroup, and school, so it's possible to try to draw some conclusions about which DCPS students are doing better or worse in writing. (OSSE has not yet released a school-by-school breakdown of composition scores for charter schools, although the schools themselves have received the information.)

For the most part, the DCPS writing score breakdown is what you would expect. Students in Wards 2 and 3 scored a lot higher than those in Wards 7 and 8, with over 75% proficient as compared to about 36% and 30%, respectively. White students were about 80% proficient, Hispanics about 50%, and blacks about 44%.

But there are some puzzling discrepancies. Common sense would indicate that reading and writing scores should track each other fairly closely, and in most cases that's true. But in some cases the two scores diverge significantly.

Perhaps it's not surprising that at 5 schools reading scores were much higher than writing scores, since writing is generally a harder skill to master. At Orr Elementary School, for example, reading proficiency was about 32%, while writing was only about 9%.

What's harder to explain are the 7 schools where the writing score was substantially higher than reading. At Stanton Elementary School, for example, reading proficiency was just under 20%, but writing was at 40%. Even more dramatic was Garfield Elementary. There, only about 15% of students scored proficient in reading, but over 55% scored proficient in writing.

And remember that overall, only 11% of DCPS students scored advanced in reading, while almost twice as many did so in writing.

Subjectivity in the scoring

One possible explanation for these inconsistencies is that scoring a writing test is not as objective a process as scoring reading and math tests. Those multiple-choice tests are scored by machines, but the DC CAS writing assessment is scored by human beings.

According to OSSE, these human beings are hired and trained by CTB/McGraw Hill and Kelly Services, a temporary personnel agency formerly known as Kelly Girl. Steps are taken to ensure quality and consistency: all raters must have a bachelor's degree or higher, all are interviewed and screened, and all receive training in the scoring rubric. Raters review sample "exemplary responses," so they know what they're looking for. In addition, about 10% of answers are scored by a second person who doesn't know the score given by the first reader.
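OSSE hasn't said how closely those second scores track the first ones, but a standard way to quantify this kind of double-scoring check is an exact-agreement rate or Cohen's kappa, which corrects for agreement that would happen by chance. A minimal sketch, using made-up rubric scores purely for illustration:

```python
from collections import Counter

def exact_agreement(scores_a, scores_b):
    """Fraction of double-scored essays where the two raters matched exactly."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Agreement corrected for chance, on a shared rubric scale."""
    n = len(scores_a)
    observed = exact_agreement(scores_a, scores_b)
    count_a, count_b = Counter(scores_a), Counter(scores_b)
    # Probability the two raters give the same score by chance alone.
    expected = sum(count_a[s] * count_b[s] for s in count_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical 1-4 rubric scores for ten double-scored essays.
rater_1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(exact_agreement(rater_1, rater_2))  # 0.8
print(cohens_kappa(rater_1, rater_2))     # ~0.71
```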

Still, it's inevitable that more subjectivity is involved in scoring a test for which there's no one right answer. A criterion like "Fully addresses the demands of the question or prompt," which is part of the scoring rubric, leaves a certain amount of wiggle room.

So it's not really clear what these tests are telling us about the state of students' writing skills. Nor is it clear that DC students will do as well on future standardized writing assessments.

Starting in 2011, DC revised its writing prompts to align more closely with the rigorous Common Core standards, which emphasize analyzing and interpreting texts. For example, the old sample 10th-grade writing prompt for the DC CAS was: "Who is likely to accomplish more: the person who adjusts to society as it is, or the person who attempts to change it?" In their answers, students were invited to draw on their reading as well as their own experience and observations, but they didn't need to interpret a text.

The new Common Core-aligned sample 10th-grade question for the DC CAS gives students a two-page story to read and then asks questions about how the author conveys a certain message through the characters. That may be more challenging than the previous DC CAS writing test, but it's probably a lot easier than the writing questions students will begin to confront in school year 2014-15.

More complex questions

That's when DC will replace its own test with a test devised by a consortium called PARCC, which DC and 19 states have joined. Like the current DC CAS, the PARCC test will ask students to read and respond to texts. But the texts for the sample 10th-grade PARCC questions are much more complex: an excerpt from a high-flown 1941 blank-verse translation of the Daedalus and Icarus myth from Ovid's Metamorphoses, and a poem on the same subject by Anne Sexton. One of the writing prompts asks students to analyze how Icarus's experience of flying is portrayed differently in the two texts. It's an assignment many college students would no doubt find challenging.

So if writing scores drop precipitously once the PARCC test comes in, that won't necessarily mean students' writing skills have suddenly plummeted. And although we'll be able to compare writing scores in DC to those in other jurisdictions that are using the same test, there may still be some subjectivity in the scoring, since the PARCC writing tests will also probably be graded by human beings.

Using a test to evaluate writing is inherently a tricky business. PARCC has said it is considering using computers to evaluate its writing tests, and the other Common Core testing consortium is actually trying to "train" computers to do the job. But that approach would inevitably bring its own problems.

And some students simply don't write well under pressure. It might make more sense to assess writing skills by means of a student portfolio rather than a time-limited test, but it's hard to see how that could be done on a mass scale.

On the other hand, if we don't test writing, it probably won't get taught. Surely one reason writing has been neglected in recent years is that the tests mandated by No Child Left Behind have focused exclusively on reading and math. And writing is too crucial a skill to be ignored. So the best we can do, it seems, is to administer writing tests and actually pay some attention to the results. But we also need to remind ourselves to take those results with a grain of salt.

Natalie Wexler is a board member at DC Scholars Public Charter School and a volunteer tutor in a DC Public School. She also serves on the board of The Writing Revolution, an organization that brings the teaching of analytical writing to underserved schools. She has been a lawyer, a historian, and a journalist, and is the author of three novels. 

Comments


One possible explanation for the schools where reading scores are much higher than writing scores is that the school cheated on the reading tests by changing answers. It's not as easy to do that on the writing test. I'd be curious to see if any have suspicious wrong-to-right erasure rates, especially if they've also seen large year-over-year increases in reading scores.

by sbc on Sep 11, 2013 12:35 pm

also, some schools are missing from the list you linked to (I didn't see Neval Thomas, Amidon-Bowen, or Garrison). And is there a side-by-side comparison of reading and writing scores for each school? I'd love to see a spreadsheet of each school's scores for the past year or two.

by sbc on Sep 11, 2013 12:41 pm

@sbc: Sorry, I don't have a spreadsheet or a side-by-side comparison of reading and writing scores (wish I did!). What I linked to is all I was able to get from DCPS. As for the missing schools, here's the explanation from DCPS: "Since the test is only given to select grades, if school isn’t on the list, it means there are not enough students to register a score."

by Natalie on Sep 11, 2013 3:39 pm

I think there is a fallacy here about reading and writing being connected. Reading, especially at higher levels, requires a great deal of content knowledge and vocabulary. Children will often score a year up if they are reading a passage in an area of interest. Writing, however, may have been scored more on the mechanics: is there a main point, are there supporting points? My understanding of these tests is that they are very mechanistic, and we may be seeing more test-prep outcome than actual ability.

by DC Parent on Sep 12, 2013 2:41 pm

Way back when the writing tests were new, I was a test 'scorer' for a season and read thousands of 3rd grade writing submissions to the prompt, "One day an alligator walked into ...".

The scoring was mostly mechanistic and involved checking off elements of writing. Credit was given for advanced language or making the essay more interesting, but a proficient score (at the time) was awarded if the writer had the proper elements of a story or argument on paper in a proper sequence.

Writing proficiency does not require language capability beyond the grade level. The vocabulary requirement for good writing is fairly low. Standardized reading tests at higher grades often impose a very high vocabulary requirement to understand the questions.

As a thought exercise, compare the language of Hemingway vs. Proust. Both are excellent writers. Proust is hard to read. Hemingway is easy. A reading test with a Proust excerpt will get lower scores than one with Hemingway -- just because of the difficulty in decoding the excerpt.

Now consider the 10th grader's capabilities and interests, and pity any teacher or school that will be tagged with the WRITING score after the new PARCC test is brought on. The Ovid excerpt to be analyzed in the new writing test has a Gunning-Fog grade-level readability score of 11.1. That passage requires 11th-grade reading ability just to understand it.
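For reference, the Gunning fog index is straightforward to compute: 0.4 times the sum of the average sentence length and the percentage of complex words. A rough sketch in Python, using the common simplification that treats any word of three or more syllables as complex (the full rule also excludes proper nouns and familiar compounds):

```python
import re

def gunning_fog(text):
    """Rough Gunning fog grade level:
    0.4 * (words per sentence + 100 * complex words / words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # Crude estimate: count runs of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    complex_words = [w for w in words if syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

# A single short sample skews high; the index is meant for ~100-word passages.
sample = ("Daedalus, hating Crete and his long exile, longed to see his "
          "native land, but the sea imprisoned him on every side.")
print(round(gunning_fog(sample), 1))
```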

Understanding the passage is key to writing an analysis essay. So the new PARCC test, ostensibly one of WRITING ability, will most certainly mark down readers who are not at or above grade level.

This is just bad test design.

by Scored Those Tests on Sep 13, 2013 12:31 pm

I think you need to be very skeptical of the increase this year. They nerfed the test.

http://dcps.dc.gov/DCPS/In+the+Classroom/What+Students+Are+Learning/DCPS+Common+Core+State+Standards#6

This explains that for this past year they tested what they called "the overlap between the old DC standard and the common core". That means any items in the old DC CAS that were not aligned with the Common Core were dropped, and no items from the Common Core that weren't in the old standards were added. This is a lower standard. Is it enough lower to explain all of the increase this year? I don't know, especially since I can't get the Excel sheets on the OSSE site that go into detail on the standards to load. I just get an "oops" over and over when I try to load them.

by Mary Melchior on Sep 20, 2013 10:38 am
