September 25, 2013

Briefly

  • Big Data and Due Process: a fairly readable academic paper arguing for legal protections against harm done by automated classification (accurate or inaccurate)
  • The Herald quotes Maurice Williamson on a drug seizure operation

“The harm prevented from keeping these analogues away from communities has been calculated at $32 million,” Mr Williamson said.

Back in 2008, Russell Brown explained where these numbers come from. As you might expect, there is no reasonable sense in which they are estimates of harm prevented. They don’t measure what communities should care about.

  • Levels of statistical evidence are ending up in the US Supreme Court. At issue is whether a press release claiming that a treatment “Reduces Mortality by 70% in Patients with Mild to Moderate Disease” was fraud, when the study wasn’t set up to look at mortality and the reduction wasn’t statistically significant by usual standards. Since a subsequent trial designed to look for mortality reductions convincingly failed to find them, the conclusion implied by the press-release title is untrue, but the legal question is whether, at the time, it was fraud.
  • From New Scientist: is ‘personalised’ medicine actually bad for public health?
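A rough sense of the Supreme Court item: a large relative reduction in mortality based on only a handful of deaths need not be statistically significant by usual standards. Here is a minimal sketch using entirely made-up counts (not the trial’s actual data): 2 deaths out of 100 on treatment versus 7 out of 100 on placebo gives a relative reduction of more than 70%, yet a two-sided Fisher exact test does not reach the conventional 0.05 threshold.

```python
from math import comb

# Hypothetical illustration only (not the trial's data): 100 patients per
# arm, 2 deaths on treatment vs 7 on placebo -- a ~71% relative reduction.
deaths_treat, n_treat = 2, 100
deaths_ctrl, n_ctrl = 7, 100

reduction = 1 - (deaths_treat / n_treat) / (deaths_ctrl / n_ctrl)

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Conditions on the margins and sums the hypergeometric probability of
    every table no more probable than the observed one.
    """
    n, row1, col1 = a + b + c + d, a + b, a + c
    def p(k):  # P(k deaths in the treatment arm | margins fixed)
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    p_obs = p(a)
    return sum(p(k)
               for k in range(max(0, row1 + col1 - n), min(row1, col1) + 1)
               if p(k) <= p_obs * (1 + 1e-9))

pval = fisher_exact_two_sided(deaths_treat, n_treat - deaths_treat,
                              deaths_ctrl, n_ctrl - deaths_ctrl)
print(f"relative reduction: {reduction:.0%}, two-sided p = {pval:.2f}")
```

With these invented numbers the point estimate looks dramatic but the evidence is weak, which is why “not set up to look at mortality” and “not significant by usual standards” matter.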


Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient. See all posts by Thomas Lumley »