January 17, 2014

Hard-to-survey populations

As we saw a few weeks ago with the unexpectedly high frequency of virgin births, it can be hard to measure rare things accurately in surveys, because rare errors will crowd out the true reports. It’s worse with teenagers, as a new paper from the Add Health study has reported. The paper is not open-access, but there’s a story in MedicalXpress.

So imagine the surprise and confusion when subsequent revisits to the same research subjects found more than 70 percent of the self-reported adolescent nonheterosexuals had somehow gone “straight” as older teens and young adults.

“We should have known something was amiss,” says Savin-Williams. “One clue was that most of the kids who first claimed to have artificial limbs (in the physical-health assessment) miraculously regrew arms and legs when researchers came back to interview them.”

This wasn’t just data-entry error, and it probably wasn’t a real change; after some careful analysis they conclude it was a mixture of genuine misunderstanding of the question (“romantic” vs “sexual” attraction), and, well, teenagers. Since not many teens are gay (a few percent), it doesn’t take many incorrect answers to swamp the real data.
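The arithmetic behind "swamping" is worth seeing explicitly. A minimal sketch, with made-up numbers (not taken from the paper): even if only 2% of heterosexual teens answer the question wrongly, that error rate applied to the large majority group produces false reports on the same scale as the true reports from the small minority group.

```python
# Illustrative only: prevalence and error rate are assumptions,
# not figures from the Add Health paper.

prevalence = 0.03   # assumed true fraction of non-heterosexual teens
error_rate = 0.02   # assumed fraction of heterosexual teens answering wrongly

true_reports = prevalence                      # minority, answering correctly
false_reports = (1 - prevalence) * error_rate  # majority, mis-answering

reported = true_reports + false_reports
share_false = false_reports / reported

print(f"Reported non-heterosexual: {reported:.1%}")        # ~4.9%
print(f"Share of reports that are errors: {share_false:.1%}")  # ~39.3%
```

With these numbers, roughly two in five of the "non-heterosexual" responses would be errors, so any health comparison based on the reported group is heavily diluted by misclassified members of the majority.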

It doesn’t matter so much that the number of gay and bisexual teenagers was overestimated. The real impact is on the analysis of health and social development in this population. In Add Health and at least one other youth survey, according to the researchers, this sort of error has probably led to overestimating the mental and physical health problems of non-heterosexual teenagers.


Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.

Comments

  • megan pledger

    When someone says that the respondents “misunderstood” the question, it usually means the question writers were the ones with the misunderstanding.

    10 years ago

    • Thomas Lumley

      I don’t think there are any objective facts as to who has a misunderstanding — the two sides thought the question meant different things.

      However, it looks as though that wasn’t the only issue in this case.

      10 years ago

      • megan pledger

        The question writers are paid to make sure that the questions have the right meaning to the target audience.

        Although to be fair, sometimes really bad questions have to be included for comparability with other surveys.

        10 years ago