NZTA much better?
This is an expansion from the “Briefly” post about an NZTA summary of public comment on their SH1 Wellington proposals.
On Bluesky, @gwynebs had pointed out that some of the bars indicating levels of support didn't appear to match the numbers attached to them — the "much better" category seemed inflated.
A couple of days ago I noted there was a pattern to the distortion: it really was only the “much better” bar that was inflated and the other four were compressed in the same proportion. That is, some varying percentage was effectively being added to the “much better” level. This is true for all five of the specific sections of the proposal, but is not true for the two overall ratings in the middle of page 2, which appear correct. The bars are also correct in the much more detailed community engagement report; it’s just the summary that is wrong — which should indicate something about where things went wrong.
This is not rounding error. It’s much larger than that.
I went and measured the widths of all the bars in the five charts. These are in the same order as in the report: from top to bottom we have "2nd Terrace tunnel", "Te Aro", "Basin Reserve", "2nd Mt Victoria tunnel", and "Hataitai and Kilbirnie". The lower bar for each is cut from the NZTA summary. The upper bar has the correct percentages plus the additional amount needed to make the bars line up — so the red is the amount that has been added to the "much better" category in the graph compared to the numbers. My bars and their bars don't line up perfectly; that is probably rounding error. One possible explanation is that the red is some sort of "Don't know" value that has inadvertently been put into the last bar — I could see that happening if the bars were drawn as pictures rather than as charts.
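The pattern described above — a fixed amount added to "much better", with all categories shrunk by the same factor so the bar still totals 100% — can be sketched in a few lines. The percentages below are hypothetical, purely to illustrate the arithmetic; they are not the actual survey numbers.

```python
# Sketch of the distortion pattern: shrink every true percentage by the
# same factor (1 - added/100), then put the added amount on the first
# ("much better") category. The total stays at 100, which is why the
# other four bars appear compressed in the same proportion.

def distorted(true_pcts, added):
    """Return displayed percentages given true ones and an added amount."""
    scale = 1 - added / 100
    out = [p * scale for p in true_pcts]
    out[0] += added  # inflate "much better"
    return out

true_pcts = [40, 20, 15, 15, 10]   # hypothetical survey percentages
shown = distorted(true_pcts, added=10)
print([round(p, 1) for p in shown])  # → [46.0, 18.0, 13.5, 13.5, 9.0]
print(round(sum(shown), 1))          # → 100.0
```

Note that under this model the ratio of each of the other four displayed bars to its true value is the same constant, which is the signature the measurements show.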
How much should we care about this? On the one hand, this sort of thing is probably corrosive to public trust in government data. On the other hand, this purports to be quantitative analysis of a self-selecting survey of the sort that attracts highly motivated and unrepresentative minorities*, so there’s a real limit to how seriously you should be taking the numbers.
Arguably, the point of this sort of survey is to see if there are surprising results — either something NZTA didn’t know about, or stronger opposition than they expected. Even so, most people who aren’t the Advertising Standards Authority would think there’s something wrong with graphs that don’t match the data they purport to present.
*eg, people such as me