July 31, 2013

It depends on how you look at it

Collapsing lots of variables into a single ‘goodness’ score always involves choices about how to weight different information; there isn’t a well-defined and objective answer to questions like “what’s the best rugby team in the world?” or “what’s the best university in the world?”. And if you put together a ranking of rugby teams and ended up with Samoa at the top and the All Blacks well down the list, you might want to reconsider your scoring system.
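
To make the weighting point concrete, here is a toy sketch (the schools, indicators, weights, and numbers are all invented): the same three schools come out in opposite orders under two equally defensible weightings of the same information.

```python
# Toy illustration: the same three "schools" rank in opposite orders
# under two defensible weightings of the same inputs. Invented numbers.
scores = {
    "School A": {"test_scores": 0.90, "improvement": 0.40, "attendance": 0.70},
    "School B": {"test_scores": 0.60, "improvement": 0.90, "attendance": 0.80},
    "School C": {"test_scores": 0.75, "improvement": 0.70, "attendance": 0.75},
}

def composite(school, weights):
    """Weighted sum of a school's indicators; weights sum to 1."""
    return sum(weights[k] * school[k] for k in weights)

# Weighting 1: mostly current test scores.
w1 = {"test_scores": 0.7, "improvement": 0.2, "attendance": 0.1}
# Weighting 2: mostly improvement.
w2 = {"test_scores": 0.2, "improvement": 0.7, "attendance": 0.1}

for w, label in [(w1, "mostly level"), (w2, "mostly improvement")]:
    ranking = sorted(scores, key=lambda s: composite(scores[s], w), reverse=True)
    print(f"{label}: {ranking}")
# mostly level: ['School A', 'School C', 'School B']
# mostly improvement: ['School B', 'School C', 'School A']
```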

On the other hand, it’s not a good look if you make a big deal of holding failing schools accountable and then reorder your scoring system to move a school from “C” to “A”. Especially when it’s a charter school founded by a major donor to the governing political party.

Emails obtained by The Associated Press show Bennett and his staff scrambled last fall to ensure influential donor Christel DeHaan’s school received an “A,” despite poor test scores in algebra that initially earned it a “C.”

“They need to understand that anything less than an A for Christel House compromises all of our accountability work,” Bennett wrote in a Sept. 12 email to then-chief of staff Heather Neal, who is now Gov. Mike Pence’s chief lobbyist.

Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.

Comments

  • megan pledger

    I tried to get the “Survey Appraisal and Public Questions Committee” of the NZSA involved with the “value added models” that have been used in New York and Los Angeles to publicly “grade” teachers, because National have been dropping hints in the blogosphere that they want to use that method here (and National Standards and PaCT are all hops in that direction). It would have been good to have a local resource on hand to put in front of National when they try to go down that road.

    The “value added models” are even more opaque than the school grading system, since they are model-based, with dodgy covariates and masses of data errors and missing data.
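
    For concreteness, here is a minimal sketch of the kind of model being described: a plain linear regression on invented data, with each teacher’s mean residual taken as their “value added”. This is not any district’s actual model; real VAMs layer on many more covariates, which is exactly where the opacity comes in.

    ```python
    # Minimal sketch of a "value added" calculation on invented data:
    # regress end-of-year scores on prior scores, then attribute each
    # teacher's mean residual to the teacher as their "value added".
    import numpy as np

    rng = np.random.default_rng(0)
    n_students, n_teachers = 200, 10
    teacher = rng.integers(0, n_teachers, n_students)
    prior = rng.normal(50, 10, n_students)
    # True model here: all teachers identical, so any "value added" is noise.
    post = 5 + 0.9 * prior + rng.normal(0, 8, n_students)

    # Ordinary least squares fit of post ~ prior.
    X = np.column_stack([np.ones(n_students), prior])
    beta, *_ = np.linalg.lstsq(X, post, rcond=None)
    residual = post - X @ beta

    # Teacher "value added" = mean residual of that teacher's students.
    # These numbers are pure noise, yet they would be ranked and published.
    vam = [round(residual[teacher == t].mean(), 2) for t in range(n_teachers)]
    print(vam)
    ```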

    Anyway, I found this article interesting…
    “Mathematical Intimidation: Driven by the Data” in the Notices of the American Mathematical Society, by John Ewing
    http://www.ams.org/notices/201105/rtx110500667p.pdf

    11 years ago

    •

      Nice article, but I laughed out loud when I saw the ad for the NSA on the last page of it. A very interesting placement, considering the article started with the G. H. Hardy apology. NSA = “the nation’s largest employer of mathematicians”. Wow. All those people who can explain the base rate fallacy to the administrators who want them to spy on everybody.

      11 years ago

    • Lee Sechrest

      I do not know why mathematicians would take an interest in Value Added Modeling. VAM is done by statisticians and other researchers using pretty standard statistical methods. Mathematics is a tool available to anyone who cares to learn it. Richard Feynman once noted that physicists are not dependent on mathematicians: they do their own math. So do statisticians.
      (I am neither a mathematician nor a statistician.)

      11 years ago

      • megan pledger

        Just because a method is available and you can use it on data doesn’t mean that it is the right method to use on that data.

        11 years ago

  • Joseph Delaney

    I am giving Megan Pledger’s comment a plus one. You first need to know that a specific score is calibrated against the outcome you care about. When the outcome is really hard to measure (post-school success or happiness, for example), it can be hard to ensure that a change in the score represents a change in that outcome.
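
    As a toy illustration of that calibration worry (the numbers are invented), a score can drift steadily upward, say because schools drill the graded test, while the outcome it is supposed to proxy stays flat:

    ```python
    # Toy illustration (invented data): a composite score can rise year on
    # year while the outcome it is meant to proxy does not move at all.
    import numpy as np

    rng = np.random.default_rng(1)
    years = np.arange(6)
    score = 70 + 2.0 * years + rng.normal(0, 1, len(years))  # drifts upward
    outcome = 50 + rng.normal(0, 1, len(years))              # stays flat

    print("year-on-year score changes:  ", np.round(np.diff(score), 1))
    print("year-on-year outcome changes:", np.round(np.diff(outcome), 1))
    # Rising scores alongside a flat outcome: the score is not calibrated
    # to the outcome, even though it looks like steady improvement.
    ```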

    11 years ago