March 25, 2013

Stat of the Week Competition: March 23 – 29 2013

Each week, we would like to invite readers of Stats Chat to submit nominations for our Stat of the Week competition and be in with a chance to win an iTunes voucher.

Here’s how it works:

  • Anyone may add a comment on this post to nominate their Stat of the Week candidate before midday Friday March 29 2013.
  • Statistics can be bad, exemplary or fascinating.
  • The statistic must be in the NZ media during the period of March 23 – 29 2013 inclusive.
  • Quote the statistic, when and where it was published and tell us why it should be our Stat of the Week.

Next Monday at midday we’ll announce the winner of this week’s Stat of the Week competition, and start a new one.

The fine print:

  • Judging will be conducted by the blog moderator in liaison with staff at the Department of Statistics, The University of Auckland.
  • The judges’ decision will be final.
  • The judges can decide not to award a prize if they do not believe a suitable statistic has been posted in the preceding week.
  • Only the first nomination of any individual example of a statistic used in the NZ media will qualify for the competition.
  • Individual posts on Stats Chat are just the opinions of their authors, who can criticise anyone who they feel deserves it, but the Stat of the Week award involves the Department of Statistics more officially. For that reason, we will not award Stat of the Week for a statistic coming from anyone at the University of Auckland outside the Statistics department. You can still nominate and discuss them, but the nomination won’t be eligible for the prize.
  • Employees (other than student employees) of the Statistics department at the University of Auckland are not eligible to win.
  • The person posting the winning entry will receive a $20 iTunes voucher.
  • The blog moderator will contact the winner at the email address they provide and send the details of the $20 iTunes voucher to that same address.
  • The competition will commence Monday 8 August 2011 and continue until cancellation is notified on the blog.

Rachel Cunliffe is the co-director of CensusAtSchool and currently consults for the Department of Statistics. Her interests include statistical literacy, social media and blogging. See all posts by Rachel Cunliffe »

Nominations

  • Stephen Murray

    Statistic: Headline: Worst crims out in only 11 years

    Some of the worst criminals sentenced to preventive detention are being freed after serving on average only 11 years in prison.

    BY THE NUMBERS

    Number of prisoners sentenced to preventive detention and released on parole from 2006-2012: 22

    Number of prisoners sentenced to preventive detention from 2006-2012: 96
    Source: Stuff website
    Date: 25/3/13

    I assume the figures they are presenting are correct, but the way the story is framed overstates what is happening. The headline implies that the average sentence for preventive detention is 11 years, but in fact the average only applies to the 22 prisoners who were released, not all prisoners on preventive detention. The average is also 11.7 years, so it would have been more correct to say “Worst crims out in only 12 years”.

    In addition, the “By the numbers” section at the bottom is phrased to imply that 22 of the 96 prisoners sentenced to preventive detention between 2006 and 2012 were released. This is obviously not the case. A better statistic would have been the sentencing figures for the cohort of prisoners who were actually released.

    Finally, there is no discussion of why the prisoners were released – e.g. had they undergone rehabilitation and were no longer deemed a risk?


  • Kate Steel

    Statistic: Headline: “Shock poll over gay marriage bill”.

    “Public opposition to same-sex marriage has grown significantly since a law change to legalise it came before Parliament.”

    “The debate over the legislation appeared to have solidified many New Zealanders’ views, with nearly 98 per cent of those polled now prepared to take a firm stance on gay marriage.”

    Once again the New Zealand Herald infers the opinions of all New Zealanders from an internal digi-poll and neglects to add caveats about the bias inherent in its sampling method. Self-selection bias wreaks havoc with any conclusions they attempt to draw.
    Source: New Zealand Herald Website.
    Date: 26/03/13

    The New Zealand Herald uses the results of a digi-poll from its own website to make inferences about the opinions of ‘the New Zealand public’.

    They aren’t sampling the entire New Zealand public; they’re sampling the opinions of readers of the New Zealand Herald online.

    Moreover, self-selection bias probably had a significant effect on the results of the poll. You cannot conclude that New Zealanders’ views about gay marriage have solidified from a poll where people elect to participate, especially since only a small, non-random percentage of the population have the opportunity to do so.

    It’s a prime example of the use of poor sampling methods to generate statistics that are not relevant to the population of interest.


  • Nikitin Sallee

    Statistic: Waikato residents are the Kiwis “least likely to iron their clothes, with 73 per cent of respondents saying they avoided it” … among other sins of omission and commission in this survey-driven article. See below.
    Source: Dominion Post
    Date: Tuesday 26 March 2013 – Page 5

    Let’s not smooth over the statistical problems in the Dominion Post’s page 5 report (26 March 2013 http://www.stuff.co.nz/dominion-post/news/8470650/Ironing-not-a-pressing-matter-for-Kiwis) on our national propensity to iron our clothes.

    The article has many wrinkles:
    – The article opens with the statement that “New Zealand is increasingly a ‘wash and wear’ nation,” but nothing in the article suggests a time series for discerning a trend, “increasing” or otherwise.
    – The survey’s sponsor (Canstar Blue) is not described, so readers lack information to judge its possible motives or biases. Canstar Blue turns out to be a “ratings company” that publishes scores “by consumers for consumers”. Its blog advises on topics like “Online dating tips: What men and women want from first encounters”, and offers the (no doubt well researched) insight that “Australian’s [sic] are no strangers to coffee addiction”.
    – “2500 Kiwis” are said to have been surveyed, but the article does not state how or by whom. Was it a telephone survey conducted by an independent professional firm? A self-selected website poll? Was the sample demographically representative? There is no sign the newspaper asked, and Canstar Blue’s website does not mention this survey at all.
    – Waikato residents are said to be the Kiwis “least likely to iron their clothes, with 73 per cent of respondents saying they avoided it”. But since no other regional ironing data is mentioned, we cannot tell by what margin Waikato folks win the most-crinkly title, nor whether the measured difference is statistically significant. Most of the figures quoted in the article suffer from similar deficiencies.

    Ironically, the article did not state, flatly or otherwise, the margin of error in the Dominion Post’s decision to devote precious reportorial resources and newsprint to this story. That’s freedom of the press for you …
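The significance worry in the last point can be made concrete with a rough margin-of-error calculation. This is a minimal Python sketch using the usual normal approximation for a sample proportion, and assuming simple random sampling (which the article never establishes); the regional subsample size of 200 is purely hypothetical, since the article does not report how many of the 2500 respondents were from Waikato.

```python
import math

def moe(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample of 2500 (the figure quoted in the article)
print(round(moe(0.73, 2500) * 100, 1))  # ≈ 1.7 percentage points

# Hypothetical Waikato subsample of 200 respondents
print(round(moe(0.73, 200) * 100, 1))   # ≈ 6.2 percentage points
```

On these assumptions, a regional figure of 73 per cent carries an uncertainty of around six percentage points either way, so a gap of a few points between Waikato and the next-most-crinkly region could easily be noise.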


  • Stephen Murray

    Statistic: Various road crash statistics for 2010-2012, e.g. “The region with the most crashes was Auckland (1132), then Canterbury (900), then Waikato (838)”
    Source: Stuff website
    Date: 27/3/13

    The overall figures appear fine (based on my experience with crash statistics). However, they only present the raw numbers, which are of limited use. For instance, the interesting part of the numbers I cited above is not that Auckland has the most crashes, which we’d expect, but that Canterbury and Waikato are not too far off.

    The time-of-day figures would be useful to examine in relation to the number of trips made within each time frame, but at least they kept the time blocks even.

    They also use the term “likely” when referring to the top 10 crash roads in New Zealand, whereas we need to know how busy those roads are to determine whether they have an abnormal crash profile.

    Overall, they’re not too bad, but they could make a better story by highlighting the factors that relate to higher crash risks, not just higher crash numbers.


  • Nick Iversen

    Statistic: Dodgy percentages
    Source: New Zealand Herald
    Date: 27 March 2013

    The percentages in this article are wrong.

    The article states that “New Zealander earned 26.3 per cent …less than the average Australian.” When I do the calculation I get (61400-48600)/61400 = 20.8% less.

    What they calculated was (61400-48600)/48600 = 26.3% which is how much MORE Australians earn than NZers.

    Then they say the “gap is even more pronounced among fulltime workers, with Australians earning 24.1 per cent more.” Well 24.1 is less than 26.3 so the gap seems less pronounced to me.

    That’s two different errors involving the terms “more” and “less”: one of calculation and one of definition.
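The baseline confusion is easy to check directly. A minimal Python sketch, using the two average-earnings figures as quoted in the article:

```python
avg_au = 61400  # average Australian earnings, as quoted
avg_nz = 48600  # average New Zealand earnings, as quoted

gap = avg_au - avg_nz  # 12800

# How much LESS New Zealanders earn, relative to the Australian figure:
nz_less = gap / avg_au * 100
print(round(nz_less, 1))  # 20.8

# How much MORE Australians earn, relative to the New Zealand figure:
au_more = gap / avg_nz * 100
print(round(au_more, 1))  # 26.3
```

The same dollar gap yields two different percentages because the denominators differ, which is exactly why “26.3 per cent less” and “26.3 per cent more” cannot both be right.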


  • Simon Connell

    Statistic: “How influenza is spread” infographic
    Source: NZ Herald website
    Date: 27-3-2013

    It’s really hard to follow what’s going on here.
    The infographic is titled “How influenza is spread”, but it’s not easy to tell what the graphic with the label “people infected” is showing. By itself, “People infected” could mean new people infected every 3 days, or could be the cumulative number of people infected by that date. It could mean “People infected over time by a particular individual”, or just be showing the spread of the flu.
    The way the graphic is set up makes it really hard to make a visual comparison of the numbers of people infected at each time period. Normally, we would expect the height of each column of people to show the number of people. In this case, however, it’s the total count of figures – equivalently, the area of the column – that carries the number. At a glance, it looks like the number of people infected drops after day 24; actually, that’s the point where the column has become an extra person wider.
    Finally, we’re told that the number of people infected by day 3 is 1.5. This is an average, but by itself “People infected 1.5” seems rather silly.
