May 4, 2015

On algorithmic transparency

An important emerging area of statistics is algorithmic transparency: what information is your black-box analytics system really relying on, and should it?

From Matt Levine:

The materiality standard that controls so much of securities law comes from an earlier, simpler time; a time when reasonable people could look at a piece of information and say “oh, yes, of course that will move the stock up” (or down), and if they couldn’t then they wouldn’t bother with it. Modern financial markets are not so intuitive: Algorithms are interested in information that reasonable humans cannot process, with the result that reasonable humans can’t always predict how significant any piece of information is. That’s a world that is more complicated for investors, but it also seems to me to be more complicated for insider trading regulation. And I’m not sure that regulation has really kept up.
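To make the statistical question at the top concrete: one standard way to ask what a fitted black-box model is actually relying on is permutation importance, where you shuffle one input at a time and see how much predictive performance drops. The sketch below is a minimal illustration of that idea, assuming scikit-learn is available; the dataset, model, and settings are invented for the example, not taken from any real trading system.

```python
# Minimal sketch: probing a black-box classifier with permutation importance.
# The data and model here are made up purely for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Simulated data standing in for whatever the black box was trained on
X, y = make_classification(n_samples=2000, n_features=10,
                           n_informative=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "black box": any fitted predictor would do
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Shuffle each feature in turn and measure how much held-out accuracy drops;
# the features whose permutation hurts most are the ones the model relies on.
result = permutation_importance(model, X_test, y_test,
                                n_repeats=20, random_state=0)

for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {i}: mean drop in accuracy = "
          f"{result.importances_mean[i]:.3f} (+/- {result.importances_std[i]:.3f})")
```

The point of an exercise like this is that the answer often surprises the people who built the model, which is exactly the transparency problem: if the builders can't easily say what the system is relying on, outside observers and regulators certainly can't.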



Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.