
Monday, December 28, 2009

School district excludes feedback from mathematicians

By Laurie H. Rogers

Last September, Spokane Public Schools created an adoption committee that was to choose a new high school mathematics curriculum. For the first time perhaps ever, parents, students and community members (including yours truly) were allowed to participate. The committee met six times in 2009. At our Dec. 9 meeting, we chose two strong finalists from eight possibilities, and we did it despite the district’s complete mishandling of the adoption process. Administrators and “facilitators” wasted our time and tax dollars on useless activities; minimized or excluded feedback from parents, teachers, students, and committee members; and continually showered us with extreme reform propaganda.

On Dec. 9, the district’s interference went to a whole new level. On that day, members of the adoption committee were barred from determining whether the eight possible curricula met the most crucial requirement on our list: “Alignment to State Standards.” Instead, we were told to use a previous assessment from the Washington State Office of Superintendent of Public Instruction. We were assured that if OSPI ranked a curriculum as being closely aligned to the state standards, the program was assumed to be “accurate, rigorous, and high quality.”

This assurance necessarily presumed that OSPI's assessment was done thoroughly, correctly and without bias. But that presumption is on shaky ground.

In 2008, under former State Superintendent Terry Bergeson, a team put together by OSPI assessed 13 high school math curricula (three texts each), mapping each to the new state standards. There wasn’t time for careful assessments of “mathematical soundness.” The top four were:
1. Holt
2. Discovering Algebra/Geometry
3. Glencoe
4. Prentice Hall

Core-Plus Mathematics Project, Spokane’s current high school math curriculum, placed sixth overall. Core-Plus is a reform program, and its text is arranged in an “integrated” fashion. Of the textbooks with a “traditional arrangement of content” (algebra, geometry, algebra 2), Holt placed second in algebra and first in geometry. Meanwhile, Discovering Algebra/Geometry placed first in algebra, but sixth in geometry.
(Although Discovering Algebra/Geometry contains a “traditional arrangement of content,” it isn’t a “traditional” textbook. The texts are heavily constructivist, with constant group work and student "discovery.")

OSPI also asked Drs. James King and George Bright to assess the top four curricula for mathematical soundness. Both men had potential conflicts of interest.
  • Dr. Bright has a Ph.D. in mathematics education (not in mathematics) and has advocated for reform math. At the time, he worked for OSPI. He also was part of the assessment team for reform curriculum Connected Mathematics.
  • Dr. King has a Ph.D. in mathematics and is an author for Key Curriculum Press (Key Curriculum Press is the publisher for Discovering Algebra/Geometry).
Only Dr. Bright reviewed the algebra textbooks. Only Dr. King assessed the geometry textbooks, but he assessed McDougal-Littell instead of Discovering Geometry because the latter had scored too low in OSPI’s initial assessment to be considered for this additional assessment. Dr. Bright found Holt and Discovering Algebra to be the best in algebra; Dr. King found Holt and Prentice Hall to be the best in geometry.

OSPI released a preliminary recommendation to the State Board of Education (SBE). Legislation required the SBE to review the recommendation before OSPI issued a final recommendation to the school districts and to the general public. The top three:
1. Holt
2. Discovering Algebra/Geometry
3. Core-Plus Mathematics

I’ll bet you’re wondering how Core-Plus snuck in there. I wondered the same thing. In January 2009 I asked OSPI why Core-Plus was recommended over other, better curricula. Greta Bornemann, OSPI’s math director, told me that Randy Dorn, the new superintendent, wanted to have at least one integrated curriculum in the top three.

And so OSPI initially chose to recommend Core-Plus (despite the entire series being widely panned) and Discovering Algebra/Geometry (despite the geometry portion of that series being widely panned). The SBE meanwhile had contracted with Strategic Teaching, Inc. to have the top four curricula assessed by other independent mathematicians. For this assessment, the fourth-ranked curriculum – Prentice Hall – was passed over so that sixth-place Core-Plus Mathematics could be assessed in depth.

Two mathematicians – Dr. Stephen Wilson, Johns Hopkins University, and Dr. Guershon Harel, University of California, San Diego – determined that Core-Plus and Discovering Algebra/Geometry are indeed mathematically unsound. Holt – while not thought to be fabulous – was the only one of the four found to be mathematically sound in all categories assessed. Following this process, OSPI issued its final recommendation to the public. Just one high school curriculum was recommended: Holt Mathematics.

This fall, Dr. Bridget Lewis, Spokane’s executive director of instructional programs, told parents in two community forums that the mathematicians conducting the state reviews did not agree on the results. This is a partial truth. All four of the in-depth reviewers (Drs. Wilson, Harel, Bright and King) chose Holt Mathematics in their final summary. Drs. Wilson and Harel also agreed on the unsound nature of Discovering Algebra/Geometry and Core-Plus Mathematics. Dr. Lewis and Rick Biggerstaff, Spokane’s secondary math coordinator, knew about the additional in-depth assessments, and also about OSPI’s sole recommendation of Holt, yet they still forced the curriculum adoption committee to use OSPI’s original, cursory scoring.

On Dec. 9, I asked Rick Biggerstaff why they did that. I mentioned the in-depth assessments from Drs. Wilson and Harel, plus another done by mathematician Dr. John Lee, University of Washington (who also found Discovering Algebra/Geometry to be inadequate). Rick Biggerstaff brushed off my concerns, saying the assessments from Drs. Wilson and Harel were only about “mathematical soundness,” not “alignment.” Pointing to OSPI’s original scoring, he repeatedly stated, “We’ve decided we’re going to use this.”

But why? Why would Spokane administrators insist on using OSPI’s original scoring when its results conflict with later in-depth assessments? The most notable aspect of OSPI’s original scoring is that the OSPI team ranked Discovering Algebra/Geometry – a highly constructivist (discovery) program – as second overall despite its dismal scoring in geometry. Perhaps Dr. Lewis and Rick Biggerstaff didn’t bother to become informed about the in-depth assessments. Or, perhaps their unstated agenda was to keep a constructivist program in the running despite its known inadequacy. Perhaps both. Are there other possibilities?

Despite all of this, a majority of the members of Spokane’s adoption committee stood tall on Dec. 9 and chose Holt Mathematics and Prentice Hall as the two finalists. We did it based on our familiarity with mathematics, our experience in mathematics instruction and tutoring, and the desires of the community we serve. I’m proud of the committee. Now, if we can successfully navigate Spokane’s brief pilot of Holt and Prentice Hall, the district’s final recommendation to the school board, the school board vote, and the funding of the new math curriculum, we’ll really be getting somewhere.

Please note: The information in this post is copyrighted. The proper citation is: Rogers, L. (December, 2009). "School district excludes feedback from mathematicians." Retrieved (date) from the Betrayed Web site:

Thursday, December 17, 2009

School district excludes feedback from parents, teachers

Statement from Laurie Rogers on the feedback (part 2):
Spokane Public Schools’ pre-selection criteria for its new high school math curriculum are purportedly based on summaries of feedback from parents, students and teachers.
Careful diligence was required, therefore, in the collection and summarizing of this feedback. Unfortunately, the processes were so poorly conducted as to render the District’s summaries of the feedback virtually worthless.
It's a shame. Parents and teachers who offered their thoughts appear to care deeply about the issue. It’s disgraceful that so many of their thoughts were rewritten, minimized, reinterpreted, questioned, doubted and “summarized” right off the page.
(In spite of this, the adoption committee members did appear to take the community feedback into account, overwhelmingly voting for Holt Mathematics or Prentice Hall Mathematics as their top choice. These two curricula will now undergo a more in-depth assessment, including a brief classroom pilot, before a final recommendation is made to the school board.)


Parent Feedback Contaminated; Parent/Teacher Feedback Excluded:

In November, SPS hosted two community “forums” on the adoption of a new high school math curriculum. Those who came to the forums listened to a 50-minute District presentation, then were asked to write down on 3”x5” cards what they want from a new high school math program.
District staff made no attempt to differentiate between parents who work for the District and parents who don’t. There was no attempt to collect feedback only from parents, or only from parents of students in the district. The forums were attended by District staff, board members, university professors, District teachers, and math coaches – including people on the curriculum committee. Cards were handed out across the room. I was given cards at both forums.
Thus, the parent feedback was contaminated from the start.

The next week, I went back to the District office and looked through the cards collected at the two forums. I saw an interesting dichotomy.
Most of the cards are clear about a desire for a more traditional approach. They variously ask for “traditional” math, “basic” math, examples, direct instruction, practice, review, standard algorithms, a textbook, mathematical principles, skill proficiency (without calculators), level-appropriate material, tutoring, individual work, a dual track, alignment with state standards, a reference set of formulas, workbooks, and clarity.
At the polar opposite are a few cards asking for connections, explorations, conceptual understanding, application, and real-world context. Looking at these cards, one might think regular parents left their home at dinner time so they could drive to a district high school and use typical educator language to ask the District for more reform math.
Just one of all the so-called “parent” cards specifically asks for a “balance” between conceptual and procedural skills, yet this one word became the framework for the District’s summary of the “parent” feedback.

Teacher feedback also was solicited in the same casual, unscientific manner. The two most common teacher requests are “examples” and more opportunities to practice skills. Close behind are requests for context, conceptual understanding or application. Also popular are requests for close alignment with the new Washington State math standards.
A dichotomy is present in the teacher cards, too; however, this dichotomy is probably legitimate. Some teachers clearly want a more traditional approach, asking for equations, algorithms, step-by-step instruction or examples for the students, basic skills, proficiency with skills such as algebra, a logical sequence to the material, and no integration of concepts.
The other group wants to stick with reform, asking for investigations and a student-centered, constructivist classroom.
The incompatibility between these philosophies was never discussed in any adoption committee meeting. Quite the contrary. All efforts to discuss it were squelched by the people running our meetings. The way these people consistently handled any disagreement over “reform” vs. “traditional” was to change the subject or substitute the word “balance,” as in “a balance between,” or a “balanced approach” – even if that wasn’t what was said.

On Dec. 3, adoption committee members were asked to go through the parent and teacher cards. We were divided into four groups and asked to “silently” lump the cards into “three to five” categories and then “come to a consensus about a phrase to describe each category.” At my table, a “parent” table, I was surrounded by administrator types, and we didn’t reach consensus.
“The parents want a textbook,” I said at one point to the Administrator In Charge of the Pen.
“I think it’s implied,” he said, refusing to write the word.
We argued back and forth. “Look,” I finally said, exasperated, showing him the parent cards. “‘Textbook.’ ‘Textbook.’ ‘Textbook.’ Just write it down.”
In the course of this process, requests for a more “traditional” approach were excluded. I asked the Administrator In Charge of the Pen to note the disagreement on the poster paper, that some of the parent cards asked emphatically for “traditional math.” Instead, he added words that ultimately fostered the impression of parent requests for balance.

On this day, we were given the opportunity to walk around the room and add notes to other summaries if we thought something was missing. I heard some administrators question what parents or teachers meant by “basic math,” “traditional math” or “standard algorithm.” I wondered what we all had to say before our desire for Math-That-Is-Not-Reform was taken seriously.
When we returned to our tables, we (as in “Not Laurie”) could permanently add additional comments if we thought they were “needed.” At my table, all additional sticky notes were plucked back off.
“You’re removing what the parents told you,” I said to the offender. She was unmoved. “This is supposed to be through our eyes,” she said.

This is the District’s “summary” of what parents requested: “Parent support; student support; practice – a lot; resources for help; real-life or contextual problems; basic skills; balanced content – align with state standards/college readiness; balanced between skills and concepts (some procedural, some contextual, not overly emphasize technology); parent/home/on line resources (textbook); user-friendly with numerous examples, (cleaner, less cluttered appearance, consistent layout).”
The teacher "summary" is strikingly similar to the parent summary. Missing from both are words like “standard algorithm,” “direct instruction” and “traditional math,” even though some committee members added them after seeing them on the cards.
Two members even acknowledged to the ESD101 facilitators that respondents aren’t in sync on a “balanced” approach. That acknowledgment isn’t reflected in the final summaries.

The missing words also don’t show up in the pre-screen criteria. The word “balance” is there, however. Also there is “socially equitable/just for the broad scope of student experiences,” even though no parent, teacher or student feedback card asked for that. In the next article, I’ll tell you about the adoption committee’s pre-screen criteria, and how they shaped – and didn’t shape – the curricula choices that were made.

Please note: The information in this post is copyrighted. The proper citation is: Rogers, L. (December, 2009). "School district excludes feedback from parents, teachers." Retrieved (date) from the Betrayed Web site:

Sunday, December 13, 2009

School district excludes feedback from committee, students

Statement from Laurie Rogers on the Feedback (Part 1):
Spokane Public Schools is in the midst of replacing "Core-Plus Mathematics," its current high school mathematics curriculum. The adoption committee’s pre-screen criteria for a new curriculum are purportedly based on summaries of feedback from parents, students and teachers.
(Or so we were told by SPS administrators and two “facilitators” hired from Educational Service District 101).
Careful diligence was required, therefore, in the collection and summarizing of this feedback. Unfortunately, the processes were so poorly conducted as to render the District’s summaries of the feedback virtually worthless.
It’s too bad. Parents, students and teachers who offered their thoughts appear to care deeply about the issue. That their desires are so misrepresented by the District and ESD101 facilitators indicates a degree of unprofessionalism and a lack of respect that I find appalling. But not surprising. Feedback from the adoption committee was misrepresented, too. Then it was tossed out.

District Throws Out Committee Feedback:
The curriculum adoption committee met five times from Sept. 29 to Dec. 3. Each time, we were to share ideas, preferences and concerns and then record summaries of our discussions on sticky notes and poster paper. We disagreed on various issues, so the poster papers reflected oppositional viewpoints.

In our Nov. 9 meeting, committee members were given a typed “perspective” of all of our written feedback to that date. That “perspective,” particularly a section called “Desired Outcome,” seems different from what I remember of the conversations. (Another committee member echoed this thought.)
Returning home on Nov. 9, I emailed Bridget Lewis, executive director of instructional programs, asking her to keep original artifacts handy. I received no reply.
On Nov. 12, I went to the District office and asked to see the artifacts. I was given Nov. 9 poster papers only. When asked for the others, Bridget Lewis and another staff member said they didn’t know where the other artifacts were. On Nov. 13, the staff member confirmed that committee feedback from September and October was “typed up and then tossed.” No apology or explanation was given.

Today, the District’s “perspective” on committee feedback doesn’t mention certain comments, words or phrases from some of the committee members. “Traditional math,” “direct instruction” and “standard algorithm,” for example, aren’t there.
One oppositional viewpoint was pretty much eliminated.

Student Feedback Is Ignored, Excluded:
This fall, SPS asked middle and high school students to share their desires for a high school math curriculum. More than 400 feedback cards were collected.

At the Nov. 9 adoption committee meeting, the ESD101 “facilitators” told members to write down categories for what we thought the students would want. Then the students’ feedback cards were spread out over three desks. We were divided into three groups and told to “silently” assign each card to one heading.
(Most of us didn’t have a chance to see more than our third of the cards. The headings were ours, chosen before we ever saw the cards. The cards contained multiple requests, yet each card was placed under a single heading. From the start, therefore, much of the student feedback was destined to be excluded.) The facilitators then asked for initial committee impressions of the student desires, and the resulting list said: “Good examples; Resources for help: glossary, Website, answers, toolkit; Easy to read and understand; Real-life content (how will I use?); Lots of practice; Technology.”
Committee members didn’t have another chance to look at the student cards, so this initial impression stood, as if it were some kind of proper analysis.

But I had promised my daughter that the student voice would be heard. On Nov. 12, I went to the District office and photocopied the student cards, took the copies home, and over a few days, categorized each student comment according to similar language.
My analysis isn’t an exact science, and it can be argued that, because the method of collecting data was unscientific, any tabulated results are bogus. The question asked of the students was not standard. The students were not given a survey with standard choices, explanations or definitions. I did not speak to the students nor have a chance to clarify their exact intent. Their comments came from their own lexicon and could have meant anything. It’s why I left the results in their own words.
Still, the student comments are consistent. The two most commonly requested items by far are “more examples” and variations of “I need explanations of how to do it.”

On Dec. 3, I gave my results to each member of the committee. When I asked for “explanations” to be added to the District’s “summary,” an ESD101 facilitator said it’s the same thing as “examples” and that I was “splitting hairs.” She didn’t add the word. I asked for “technology” to be removed, since very few students asked for that (29 did say they like their calculators). She refused to remove the word.
Later, I persisted with the two facilitators: “What’s the point of asking the students their views if you aren't going to write down what they said?” Finally, one of them agreed to add the word “explanations,” and she placed a tiny question mark next to “technology.”
The next morning, on Dec. 4, I received an email from Bridget Lewis, telling committee members how happy she was with our effort ... and by the way,

“One caution...when we requested this feedback from these three groups, we did not indicate to them that these comments would be public. This is the reason for only posting our summary of the perspective. Displaying individual card statements publicly would not be appropriate since we did not make that known at the time of the request for input.”

I pondered this email. In my table and summary on student feedback, I don’t have names, grades, classes, schools or programs. There is no identifying information. The table simply collects “like” comments and counts them. The original cards had been spread out on tables in a curriculum adoption meeting that was open to the public. Committee members had viewed the cards and/or openly discussed them in two public meetings. The cards had been taken to the District’s central office where they were viewed by more than one person and kept openly on at least one administrative desk. I was allowed to photocopy the student and parent feedback cards and also to take those photocopies home. Now, suddenly, this information is no longer public? Meanwhile, the District has published some of the students’ exact language.

Well, I am a rule follower, even if I think the rules were put in place solely to squelch debate, foster a predetermined viewpoint, or keep pertinent, critical information from seeing the light of day.
Following is my summary of the top student requests, in order, from most commonly cited to least. I presume that, where the District published exact student language, they did it “appropriately,” and so I used the same student language, placed in “quotes.” I paraphrased the rest.

Laurie Rogers’ Summary of the Top Student Requests

Students said they want:

  • More “examples”
  • “Explanations,” line by line, of how to do each skill
  • Helpful “resources” within the textbook structure, such as the meanings of words, “answers,” “glossary,” directory, lists of mathematical procedures, explanations of mathematical symbols
  • Clearer and simpler language, “easier to read and understand”
  • Classical math – the math schools used to teach, the math that will get them to college without remediation – with “equations, algorithms, formulas, theorems”
  • Useful “visuals”
  • Uncomplicated word problems; (or) No more word problems
  • Content that’s germane to them, to their life, to college and to their future
  • More time and opportunity to “practice” skills
  • Small, portable machines that will calculate for them
  • The paid adult in the classroom to actually show them how to do things
  • To be allowed to progress when they understand something
  • Help from a “Website”
  • To learn a skill before they’re told to use it
  • A textbook that isn’t so big and heavy
  • A book they can work in at home

In the next article, I’ll tell you what parents and teachers asked for, and what the District says they asked for. The parent and teacher requests, and the District’s summaries of their requests, are not the same.

Please note: The information in this post is copyrighted. The proper citation is:
Rogers, L. (December, 2009). "School district excludes feedback from committee, students." Retrieved (date) from the Betrayed Web site: