Greater Greater Education

There's a test that may give us a clearer picture of student growth, but DCPS is reluctant to consider it

Next year DC students will be taking new standardized tests aligned to the Common Core. Some are urging education officials to adopt a test that will provide a more accurate measure of student growth, but DCPS is reluctant, saying the switch might undermine confidence.


Photo by Duncan Hull on Flickr.

DC's Office of the State Superintendent of Education (OSSE) recently suggested considering a Common-Core-aligned test that would enable it to measure student growth more precisely than the assessment it's currently planning to use. According to the minutes of a recent meeting, however, DCPS Chancellor Kaya Henderson objected because the teachers' union and the public might see the switch as an opening to attack the concept of the Common Core and testing in general.

DCPS Chief of Data and Strategy Pete Weber disputed OSSE's characterization, saying in a statement that DCPS was "surprised" OSSE wanted to consider switching tests and that such a switch would "undermine the confidence we have worked so hard to build" in all stakeholders, including unions.

DC, along with a number of states, is currently scheduled to replace its local standardized tests with tests created by the PARCC consortium in the spring of 2015. OSSE made that decision several years ago. PARCC is one of two organizations that are preparing standardized tests to assess students on the basis of the Common Core standards.

Some other states have chosen to use tests from the second consortium, Smarter Balanced. Unlike the PARCC test, the Smarter Balanced test is "computer-adaptive," meaning that the questions change depending on a student's level of ability. Many advocates believe the promise of data-driven school quality and accountability hinges on the ability to measure student growth in this way.

The tests being created by PARCC are similar to the local test, the DC CAS, in that there is a different test for each grade. Because the different tests can't be compared, they don't measure a student's actual growth from year to year. Instead we have to settle for a measure known as percent proficiency.

(PARCC disputes this characterization of their test. See update below.)

We hold schools accountable by comparing different groups of students from year to year based on the percent who achieve a designated proficiency score. That makes it difficult to determine whether gains in test scores are due to improvements in student achievement or to gentrification and rising incomes, since more affluent students generally get higher scores on standardized tests.

Last year 39 DC education activists, including me, signed a letter arguing for a switch to computer-adaptive assessments that measure student growth, like the tests being developed by Smarter Balanced.

A computer-adaptive test can determine that an 8th-grader has a 4th-grade reading level and provide questions geared to that student's level. It can also identify a student performing above grade level and determine how advanced that student is.
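The basic logic is simple: after each answer, the test serves an easier or harder item, so the questions converge on the student's actual level. Here is a deliberately simplified sketch of that loop. It is illustrative only; the function name, the grade-level scale, and the one-step-up/one-step-down rule are all hypothetical. Real adaptive tests like Smarter Balanced select items using statistical item-response-theory models, not a fixed step size.

```python
def run_adaptive_test(answer_fn, start_level=8, num_questions=10,
                      min_level=1, max_level=12):
    """Toy adaptive-test loop (hypothetical, for illustration).

    answer_fn(level) -> bool simulates whether the student answers a
    question pitched at the given grade level correctly.
    Returns the final difficulty level and the (level, correct) history.
    """
    level = start_level
    history = []
    for _ in range(num_questions):
        correct = answer_fn(level)
        history.append((level, correct))
        if correct:
            level = min(max_level, level + 1)   # serve a harder item next
        else:
            level = max(min_level, level - 1)   # serve an easier item next
    return level, history

# A simulated 8th-grader who reads at a 4th-grade level: the test starts
# with 8th-grade items, steps down as the student misses them, and then
# oscillates around the student's true level.
final_level, history = run_adaptive_test(lambda lvl: lvl <= 4)
```

A fixed-form 8th-grade test, by contrast, would simply record this student as failing nearly every item, revealing little about where they actually are.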

The PARCC test, on the other hand, is a "fixed-form" test, with a set number of unchanging items. It can work well for assessing students who are on grade level, but experts say it doesn't do a good job of measuring the knowledge of students at one extreme or the other. Many students in DC's high-poverty schools perform below grade level.

Advocates of computer-adaptive testing say it equips teachers with individualized data about each student's growth. Teachers and schools that struggle with low-income populations would finally get credit if they advance their students' performance, whether or not the advancement moves them to a designated proficiency level.

Discussion at OSSE meeting

On February 3rd, OSSE officials met with representatives from three charter networks and an unnamed DCPS official. In testimony before the DC Council, State Superintendent Jesus Aguirre identified the official as Chancellor Henderson.

Those present at the meeting discussed testing alternatives, among other topics. Minutes of the meeting, recorded by an OSSE official, were provided to Greater Greater Education; a charter representative who attended confirmed their accuracy.

According to the minutes, "OSSE discussed their intentions to engage in a series of stakeholder discussions with regards to the choice of common core next generation assessments available to the District. OSSE stated that it will make a public decision by March 3rd."

The minutes record that a representative from KIPP DC said that "PARCC will be a disaster," in KIPP's view. But Henderson stated "that they [DCPS] remain committed to PARCC." The minutes go on to explain Henderson's concerns.

DCPS is concerned that switching or publicly contemplating a switch will make the District look "wishy-washy", and is concerned that a switch would be a moment of weakness that the unions may capitalize on to argue against common core and assessments in general.

Henderson's concern about union opposition appears misplaced, given that Washington Teachers Union president Elizabeth Davis signed the letter from education activists supporting a move to computer-adaptive testing and growth metrics.

DCPS Data Chief Pete Weber replied in a statement that OSSE's minutes do "not capture DCPS' concerns related to PARCC".

As an organization, we continually work to build confidence in our staff, parents, teachers, union stakeholders, and students. We are worried that making changes in assessments without a clearly articulated reason for making those changes will undermine the confidence we have worked so hard to build. We have repeatedly expressed these concerns to the OSSE and it is unfortunate that they did not provide full context for our concerns in discussing our beliefs.

Weber went on to list several steps that show they "have already invested heavily in time and resources that are specific to PARCC," including teacher training and technology procurement. He also said that "the OSSE has not been able to articulate a specific reason to make a change now," and that DCPS has consulted "national experts" who "have echoed our opinion that there is no new information at this time to inspire a change in decision."

Problems with fixed-form tests

The activists identified several problems caused by static assessments like DC CAS and PARCC.

  • Without growth metrics on a student level, teachers and principals can't see when students begin to slip in their performance and target their interventions accordingly.
  • Teachers know which students are closest to the percent proficiency bar, and are strongly incentivized to focus on those students over others. Extensive research has confirmed that static metrics can lead to this result. It's known as "parking," because students whose scores are far from the next rung on the ladder are "parked" while others are advanced.
  • A teacher who works hard to move a 10th-grader from a 5th- to an 8th-grade reading level gets no credit for this achievement. In fact, focusing on such students may actually result in a loss of compensation.
  • Until we have growth metrics, we don't know which schools and which classrooms are actually adding value and which ones are simply benefiting from rising average incomes of families in DC.

Some PARCC advocates claim that its more traditional fixed-form model does allow for growth metrics, albeit with broader margins of error than computer-adaptive testing. And PARCC Director of Policy Jeff Nellhaus says Smarter Balanced is using the greater number of questions available on its tests to more precisely measure students who are on grade level, not to ask students questions geared to different grade levels.

However, an official with the Educational Testing Service, Nancy Doorey, was more cautious. In an email, Doorey said that ETS will "need to see the results of the field tests before they can determine whether [the PARCC] approach will suffice."

So far, OSSE has held no public hearings on which type of test to use, and neither has DC's State Board of Education. The CEO of PARCC, Laura Slover, is also the Ward 3 representative to the DC State Board of Education, though she has recused herself during discussions of PARCC.

Shying away from a conversation about the testing options makes no sense. And while DCPS may have invested in preparing teachers, students, and the public for PARCC, that doesn't justify sticking to a decision if it's the wrong one. The best approach, and the one that is fairest to students, is to have a discussion that provides the public with an opportunity to compare all "next generation" testing methods.

Update: PARCC officials dispute the characterization of their test, saying that they will report a growth metric using data from fixed-form tests. According to Communications Director David Connerty-Marin, "PARCC does not compare proficiency from one year to the next as a way of measuring growth. There are a number of accepted ways it can be done. One is to chart where a student is expected to be based on two or three years of test data and then see how the student does compared to that expectation. There are other methods, too, that are more than simply a comparison of proficiency scores."

Ken Archer is CTO of a software firm in Tysons Corner. He commutes to Tysons by bus from his home in Georgetown, where he lives with his wife and son. Ken completed a Masters degree in Philosophy from The Catholic University of America. 

Comments

It appears that DCPS is more concerned with trying to perpetuate the idea that they're making progress than actually worrying about progress itself.

As a former teacher for DCPS, I will tell you that the parking phenomenon discussed in this article occurred frequently when I taught for the schools from 2008 to 2012. As a young, data-oriented teacher, it was quite disheartening to see data used in a manner that actively privileged "cusp" children over kids at the bottom. That's not how data is supposed to be used. It's exploiting the practice.

I remember having to fill out forms every year called "IAPs" or individual achievement plans. These forms were made for the so-called cusp students in order to ensure that they specifically received attention to subsequently maximize their point potential. The kids at the bottom, well, same old same old for them. Sad. We had to sit in meetings with our school's principal and "guarantee" our boss that the cusp kids would move levels. They were usually intimidating one on one meetings. Still have the forms on my hard drive...

by Glenmonster on Feb 26, 2014 6:36 pm

What's the cost and time difference between the PARCC test and the computer-adaptive test you recommend?

The tests being created by PARCC are similar to the local test, the DC CAS, in that there is a different test for each grade. Because the different tests can't be compared, they don't measure a student's actual growth from year to year.

So the computer-adaptive test asks the same questions from grade to grade? How does that work?

by MLD on Feb 27, 2014 9:38 am

Thank you for this interesting post. If we have to have so much testing, please let it be meaningful. What is the best way for us to weigh in with DC in favor of moving to a computer-adaptive test? Thanks.

by Susanna on Feb 27, 2014 10:26 am
