March 6, 2017

Cause of death

In medical research we distinguish ‘hard’ outcomes that can be reliably and objectively measured (such as death, blood pressure, activated protein C concentrations) from ‘soft’ outcomes that depend on subjective reporting.  We also distinguish ‘patient-centered’ or ‘clinical’ or ‘real’ outcomes that matter directly to patients (such as death, pain, or dependency) from ‘surrogate’ or ‘intermediate’ outcomes that are biologically meaningful but don’t directly matter to patients.  ‘Death’ is one of the few things we can measure that’s on both lists.

Cause of death, however, is a much less ideal thing to measure.  If some form of cancer screening makes it less likely that you die of that particular type of cancer but doesn’t increase how long you live, it’s going to be less popular than if it genuinely postpones death.  You might not expect cause of death to be hard to measure objectively and reliably. But it is.
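To make the screening point concrete, here’s a made-up numbers example (nothing below comes from any real trial): a programme can cut deaths from the screened-for cancer by a third while leaving deaths overall untouched.

```python
# Hypothetical screening trial; every number is invented for illustration.
arm_size = 100_000  # people per arm, same in both arms

control_cancer_deaths = 300    # deaths from the screened-for cancer
screened_cancer_deaths = 200

control_all_deaths = 10_000    # deaths from any cause
screened_all_deaths = 10_000

# Cause-specific mortality: 3.0 vs 2.0 per 1000, so screening 'works'.
print(1000 * control_cancer_deaths / arm_size,
      1000 * screened_cancer_deaths / arm_size)

# All-cause mortality: 100 vs 100 per 1000. Nobody lived longer;
# the avoided cancer deaths just show up under other causes instead.
print(1000 * control_all_deaths / arm_size,
      1000 * screened_all_deaths / arm_size)
```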

Suppose someone smokes heavily for many years and as a result develops chronic lung disease, and as a result develops pneumonia, and as a result is in hospital, has a cardiac arrest due to a medical error, and dies. Is the cause of death ‘cardiac arrest’ or ‘medical error’ or ‘pneumonia’ or ‘COPD’ or ‘smoking’?  The best choice is a subjective matter of convention: what’s the most useful way to record the primary cause of death? But even with a convention in place, there’s a lot of work to make sure it is followed.  For example, a series of research papers in Thailand estimated that more than half the deaths from the main causes (eg stroke, HIV/AIDS, road traffic accidents, types of heart disease) were misclassified as less-specific causes, and came up with a way to at least correct the national totals.
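To see how a correction like that can work in principle, here’s a minimal sketch in Python. Everything in it is invented for illustration (the cause categories, the matrix, the totals), and the actual Thai studies used medical-record review and a more elaborate reclassification. But one simple version of the idea is to estimate, from a validation sample, how often each true cause gets registered as each recorded cause, and then invert that relationship to correct the registered totals.

```python
import numpy as np

# Three illustrative cause categories; the real studies used many more.
causes = ["stroke", "HIV/AIDS", "ill-defined"]

# M[i, j] = P(registered as cause i | true cause j), as might be
# estimated from a validation sample of re-reviewed deaths.
# Each column sums to 1: every death is registered as something.
M = np.array([
    [0.45, 0.02, 0.05],   # registered as stroke
    [0.05, 0.40, 0.01],   # registered as HIV/AIDS
    [0.50, 0.58, 0.94],   # registered as ill-defined
])

registered = np.array([9_000.0, 4_000.0, 47_000.0])  # national totals

# If registered = M @ true_totals, then true_totals = inverse(M) @ registered.
true_totals = np.linalg.solve(M, registered)

# With these invented numbers, corrected stroke and HIV/AIDS totals come
# out far above the registered ones (more than half were misregistered,
# in the spirit of the Thai finding), and 'ill-defined' shrinks to match.
for cause, r, t in zip(causes, registered, true_totals):
    print(f"{cause:12s} registered {r:7.0f}   corrected {t:7.0f}")
```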

In Western countries, things are better on average. However, as Radio NZ described today, in Australia (and probably NZ) deaths of people with intellectual disability tend to be reported as due to their intellectual disability rather than to whatever specific illness or injury they had.  You can see why this happens, but you should also be able to see why it’s not ideal for improving healthcare for these people.  Listen to the Radio NZ story; it’s good. If you want a reference to the open-access research paper, though, you won’t find it at Radio NZ. It’s here.

 


Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient. See all posts by Thomas Lumley »

Comments

  • steve curtis

    Sometimes it’s the other way round: some ongoing but minor cause seems to get listed as the actual cause.
    eg Air pollution. No doubt people’s lives are shortened by what’s in the air they breathe, but the only statistical measure around seems to be reported as actual deaths. I’m sure none of those people have that diagnosis listed as cause of death, as the numbers are modelled. Numbers of deaths may have been useful to show various effects of living near power stations or motorways, but the real effect is shortening of lives. But ‘deaths’ is sexy in headlines, when something like 1-year increments of life shortening is more realistic.

    7 years ago

    • Thomas Lumley

      Those aren’t reported causes of death; they’re modelled from variations in death rates.

      I think the estimates of short-term deaths and hospitalisations from air pollution are reasonably solid, because there aren’t a lot of confounding variables available.
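      For a feel for what ‘modelled from variations in death rates’ means, here’s a toy sketch in Python. The data, variable names, and effect sizes are all simulated and invented, but it has roughly the shape of a standard short-term time-series analysis: regress daily death counts on same-day pollution with a Poisson model, adjusting for weather and season.

      ```python
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2017)

      # Simulated daily data for one city; real analyses use mortality
      # registers and monitor readings, and many more confounders.
      n_days = 1000
      day = np.arange(n_days)
      temp = 15 + 10 * np.sin(2 * np.pi * day / 365) + rng.normal(0, 2, n_days)
      pm10 = rng.gamma(shape=4, scale=5, size=n_days)  # particulates, ug/m3

      # 'True' model: baseline 50 deaths/day, log-rate up 0.002 per ug/m3.
      log_mu = np.log(50) + 0.002 * pm10 - 0.002 * (temp - 15)
      deaths = rng.poisson(np.exp(log_mu))

      # Poisson regression of daily deaths on same-day PM10,
      # adjusting for temperature and a smooth seasonal term.
      X = sm.add_constant(np.column_stack([
          pm10,
          temp,
          np.sin(2 * np.pi * day / 365),
          np.cos(2 * np.pi * day / 365),
      ]))
      fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()

      # The PM10 coefficient recovers roughly the 0.002 used to simulate;
      # no individual death ever gets 'air pollution' as its recorded cause.
      print(fit.params[1], "+/-", fit.bse[1])
      ```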

      The estimates for long-term effects are much harder to pin down. David Spiegelhalter has a nice discussion of this issue that I linked in a recent “Briefly” post.

      7 years ago