Sunday, September 11, 2011

Lies, damn lies and the myth of "standardized" tests

[Note from Laurie Rogers: Recently, results from the 2011 state standardized tests were released. The impression given to the public by the state education agency (OSPI) and by media in Seattle and in Spokane was that improvements had been made. Did some of the numbers go up? Assuredly. Did that mean that real improvements in real academic knowledge had been made? It's best to remain skeptical.

Most students in Spokane are as weak in math skills this year as they were last year. Given a proper math test that assesses basic skills, many high schoolers still test into 4th- or 5th-grade math. College remediation rates are still high. Parents are still frantic, and students are still stressed out about math. What do the numbers actually mean?

Calculators were allowed on some of the tests, and not all standards were tested. There is also the matter of the cut scores. (A cut score is the score at which a result on the test counts as passing.) On the math HSPE, students needed less than 60% to pass, but what does that mean? It's hard to say. Some of the questions tested students at a "proficient" level, and some at a "basic" level. Students could pass at the "basic" level while missing every question that tested above it. A student's grade, therefore, isn't a straight grade. No one can say what a grade means in terms of actual knowledge. What's the point of issuing grades no one can understand? The point should be to engage in accountability and transparency, and to help the children.

There is another problem, however. Guest Marda Kirkwood, from CURE (Citizens United for Responsible Education), argues that Washington's "standardized tests" aren't standardized at all.]

**************************************

Lies, damn lies and the myth of “standardized” tests


By Marda Kirkwood

"The glory which is built upon a lie soon becomes a most unpleasant incumbrance [sic]. How easy it is to make people believe a lie, and how hard it is to undo that work again!" - Mark Twain in "Eruption"

So here I go, taking on the monumental task of undoing a lie that is repeated so often in the media, by elected officials (who should know better), and by educrats (who, I am convinced, do know better) that it has come to be generally accepted as fact.

Here is the lie: The High School Proficiency Exam (HSPE), the Measurements of Student Progress (MSP), and their precursor, the Washington Assessment of Student Learning (WASL) are “standardized” tests.

What, logically, are the characteristics that would allow us to legitimately label a test as “standardized”? It doesn’t take much thought to come up with a few guidelines.

A “standardized” test must:
  • Be completely objective. There can be no judgment involved in determining whether an answer is right or wrong.
  • Have specific time constraints. We all remember, “Time is up! Put your pencils down now.”
  • Receive the same score no matter when, how, or by whom it is scored. It should not matter if it is a Monday or a Friday, or if test scorers are having a good or a bad day.
  • Ask all students in the same grade the same questions every year.
  • Reliably measure what academic knowledge the student knows.
  • Serve as an accurate guideline for teachers, principals, school board members, parents, and others to evaluate the quality of curricula and instruction.
  • Be “valid and reliable.” These terms are roughly analogous to accuracy and precision in target shooting. Valid means the test accurately measures what it is intended to measure. Reliable means it produces consistent scores across repeated administrations. And just because some government official or entity declares a test “valid and reliable” does not make it so.
  • Be norm-referenced. That means the scores have been compared, on a bell curve, with the scores of other students across the nation who took the same test. The comparison score is reported as a percentile. A student who receives a 70th-percentile score performed better than 70% of the other students in the nation who took the same test.
This also gives us a pretty good idea what real standardized tests are not. Here are some characteristics of an assessment that is not a standardized test:
  • It includes essay or short-answer questions, which must be subjectively hand-scored and require a fallible human to use judgment to determine a score. Repeat after me: “subjective.”
  • It has lots of variation in test questions from year to year, changing types and difficulty levels.
  • Students are allowed to take however long they feel like to complete the test.
  • A committee votes on a “cut score” (i.e. what level is “passing”) after the tests have been scored.
Guess which list describes the HSPE/MSP/WASL? Yup, these assessments include all the features that characterize what standardized tests are not. These are (theoretically, anyway) standards-based assessments, very different from standardized tests. They are supposedly designed to measure the Essential Academic Learning Requirements (EALRs) – our state learning standards. That is debatable, but food for another topic.

The EALRs are a moving target: the Superintendent of Public Instruction (SPI) can change them without the consent of the Legislature, as Terry Bergeson did when she was SPI. We have to assume the tests measure the EALRs, because that is what federal law requires under No Child Left Behind. But the results are meaningless when they can be so easily manipulated by the choice of questions, the choice of cut scores, and the directions given to the scorers.

The list of standardized tests includes the Iowa Test of Basic Skills (ITBS), the Comprehensive Test of Basic Skills (CTBS), the Metropolitan Achievement Test (MAT), and the Stanford Achievement Test (SAT – not the college entrance exam, which is the Scholastic Aptitude Test). Even these are beginning to be WASL-ized, as some of them have started adding writing sections. Washington eliminated the use of norm-referenced, standardized tests in 2005.

During the few years when Washington used both the WASL and a standardized test, WASL scores steadily increased while scores on the standardized test remained flat. Are you surprised now that the use of the standardized tests was eliminated?

Another little fact for your file, in this era of recession and government cutbacks: standardized tests are very cheap, as they can be scored by a computer in seconds. Standards-based assessments, since they must be scored by armies of humans, cost a bundle. In 2003, the ITBS cost $2.88 per student. The same year, the WASL cost $73 per student. Oh, and that $73 covers only the cost of printing and scoring the assessments. Nothing in that figure covers development, administration of the test to the students, or test security. It also doesn’t include what economists call opportunity costs: all the hours students are forced to spend on test preparation and practice tests are lost opportunities to learn new things.

So, next time you hear someone refer to Washington’s “standardized” tests, remember what you learned here. And explain it to your neighbors. The only way to combat lies is with the truth.


Marda Kirkwood is the founding chair of Citizens United for Responsible Education (CURE) and an advisor to its current board. She has a degree in chemical engineering (with honors) and worked in the field for five years before retiring to be a stay-at-home mom. She homeschooled her children for eight years. This article was originally published Aug. 7, 2011, on the CURE Web site. For permission to reprint this article, please contact Marda at Marda@curewashington.org



From Laurie Rogers: If you would like to submit a guest column on public education, please write to me at wlroge@comcast.net. Please limit columns to no more than 1,000 words. Columns might be edited for length, content or grammar. You may remain anonymous to the public; however, I must know who you are. All decisions on guest columns are the sole right and responsibility of Laurie Rogers.

2 comments:

Vain Saints said...

This is nitpicking, but surely someone concerned about maintaining writing standards should hesitate before presenting her readers with topics that eat. "Food for another topic"? Ugh.

Anonymous said...

Thank you for spreading the truth about Washington's NCLB assessments.

In addition to all the items listed as not being standard on the tests, Washington's assessments also include pilot "test" questions, new items being tried out for future tests. These "test" questions are not labeled and supposedly not scored, but I assume they are simply excluded from the final cut-score calculations.

In 2006 my kids talked with friends in their classes and friends at other schools in our district and discovered that the questions they received on the exams were different.

I questioned Joe Wilhoft at OSPI about the different questions. He told me they were just "test" questions, that it was no big deal to include them on the test, and that they were not considered in the score. After a little more digging, I heard from a credible source that the people grading the tests were not told to ignore the "test" question answers when figuring the final score. Unfortunately, I was never able to verify that information for myself, and I didn't realize at the time that having proof would have been a good idea.

In addition, NCLB requires that ALL students take the SAME test on the SAME day, but I guess Washington didn't really care about possibly stumping a student with a poorly written "test" question that could throw the student off for the rest of the test.