Posts filed under Social Media (50)

November 26, 2014

What doesn’t get into the papers

I complain a lot about the publicity-based surveys of varying quality that make it into the NZ media, but there’s a lot more that gets filtered out.

A journalist (who I’m not sure I should name) sent me an example from Mitre 10:

The research surveyed more than 1,500 New Zealanders on their connection to the quarter-acre dream and asked their opinions on the size of back yards and what they were doing to make the most of them.

An overwhelming 84 per cent of respondents agreed that they liked the idea of the traditional Kiwi quarter-acre paradise – a large plot of land with a standalone house on it, with plenty of room outdoors, and almost all said they would rather live on the traditional quarter-acre section than in high-density housing with reduced outdoor living spaces.

Over half of respondents felt that their outdoor living space is smaller now than what they had growing up (53%). Fifty percent of respondents attributed this to sections of land getting smaller, while 35 per cent believe houses are getting bigger, so there’s less room on a section for an outdoor living space.

The press release is a well-crafted example, with supporting evidence from QV that house sizes are increasing and quotes from a Massey University researcher — not about the survey, but about the general topic.

The survey, on the other hand, was fairly bogus. It was online, and most of the respondents got there through the Mitre 10 Facebook page.  You’d expect (and the Mitre 10 CEO has said) that the Facebook page attracts Mitre 10 customers, not necessarily a representative sample.  The report confirms this, with 88% of respondents being born in NZ, compared to about 75% of the population as a whole.
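For a sense of how far off that is, here is a rough back-of-the-envelope check in Python (my arithmetic, not the report’s; the sample size and population figure are approximations): if the respondents really were a random sample of New Zealanders, an 88% NZ-born figure would be wildly implausible.

```python
# Back-of-the-envelope check (not from the report): if the ~1,500 respondents
# were a random sample of New Zealanders, how surprising would 88% NZ-born be
# when the population figure is about 75%?
from math import sqrt

n = 1500          # approximate sample size from the press release
p_pop = 0.75      # approximate proportion of the NZ population born in NZ
p_sample = 0.88   # proportion of survey respondents born in NZ

se = sqrt(p_pop * (1 - p_pop) / n)   # standard error under random sampling
z = (p_sample - p_pop) / se

print(f"standard error: {se:.4f}")   # about 0.011
print(f"z-score: {z:.1f}")           # about 11.6, far beyond sampling noise
```

The gap is more than ten standard errors, so it isn’t the luck of the draw; the respondents are simply a different group from the population.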

To make matters worse, here’s the reported data for the paragraphs quoted above. “Houses are bigger” and “sections are smaller” were alternative responses to the same question. You couldn’t answer that both were true, even though that is the correct answer and the position the report itself is pushing.

[chart: the reported survey responses]

One more finding I can’t resist quoting: “The majority of Kiwis (24%) have spent between $1,000 and $5,000 on their outdoor living spaces over the past year. “

[chart: reported spending on outdoor living spaces]

November 16, 2014

John Oliver on the lottery

When statisticians get quoted on the lottery it’s pretty boring, even if we can stop ourselves mentioning the Optional Stopping Theorem.

This week, though, John Oliver took on the US state lotteries: “…more than Americans spent on movie tickets, music, porn, the NFL, Major League Baseball, and video games combined.”

(you might also look at David Fisher’s Herald stories on the lottery)

July 23, 2014

The self-surveillance world

See anyone you know? (click to embiggen)

[screenshot: map of geotagged cat photos from I know where your cat lives]

This is a screenshot from I know where your cat lives, a project at Florida State University that is intended to illustrate the amount of detailed information available from location-tagged online photographs, without being too creepy — just creepy enough.
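To get a sense of how easy this sort of thing is, here’s a minimal Python sketch (my illustration, not the project’s actual code; it assumes the Pillow library, and “photo.jpg” is a placeholder filename) that reads back the GPS coordinates a phone typically embeds in a photo’s EXIF data.

```python
# Minimal sketch: extract embedded GPS coordinates from a photo's EXIF data.
# Requires Pillow; "photo.jpg" is a placeholder for any geotagged JPEG.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_from_photo(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853, {})                       # 34853 is the GPSInfo tag
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

    def to_degrees(dms, ref):
        # EXIF stores degrees, minutes, seconds as rationals
        d, m, s = (float(x) for x in dms)
        deg = d + m / 60 + s / 3600
        return -deg if ref in ("S", "W") else deg

    if "GPSLatitude" in gps and "GPSLongitude" in gps:
        lat = to_degrees(gps["GPSLatitude"], gps.get("GPSLatitudeRef", "N"))
        lon = to_degrees(gps["GPSLongitude"], gps.get("GPSLongitudeRef", "E"))
        return lat, lon
    return None

print(gps_from_photo("photo.jpg"))   # e.g. (-36.85, 174.76), or None if no geotag
```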

(via Robert Kosara and Keith Ng)

July 1, 2014

Facebook recap

The discussion over the Facebook experiment seems to involve a lot of people being honestly surprised that other people feel differently.

One interesting correlation, based on my Twitter feed, is that scientists involved in human subjects research were disturbed by the research, and those not involved were not. This suggests our indoctrination in research ethics has some impact, but it doesn’t answer the question of who is right.

Some links that cover most of the issues

June 29, 2014

Ask first

Via The Atlantic, there’s a new paper in PNAS (open access) that I’m sure is going to be a widely cited example by people teaching research ethics, and not in a good way:

 In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

More than 650,000 people had their Facebook feeds meddled with in this way, and as that paragraph from the abstract makes clear, it made a difference.

The problem is consent.  There is a clear ethical principle that experiments on humans require consent, except in a few specific situations, and that the consent has to be specific and informed. It’s not that uncommon in psychological experiments for some details of the experiment to be kept hidden to avoid bias, but participants still should be given a clear idea of possible risks and benefits and a general idea of what’s going on. Even in medical research, where clinical trials are comparing two real treatments for which the best choice isn’t known, there are very few exceptions to consent (I’ve written about some of them elsewhere).

The need for consent is especially clear in cases where the research is expected to cause harm. In this example, the Facebook researchers expected in advance that their intervention would have real effects on people’s emotions; that it would do actual harm, even if the harm was (hopefully) minor and transient.

Facebook had its research reviewed by an Institutional Review Board (the US equivalent of our Ethics Committees), and the terms of service say they can use your data for research purposes, so they are probably within the law.  The psychologist who edited the study for PNAS said

“I was concerned,” Fiske told The Atlantic, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time.”

Fiske added that she didn’t want the “the originality of the research” to be lost, but called the experiment “an open ethical question.”

To me, the only open ethical question is whether people believed their agreement to the Facebook Terms of Service allowed this sort of thing. This could be settled empirically, by a suitably-designed survey. I’m betting the answer is “No.” Or, quite likely, “Hell, no!”.
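The sample-size arithmetic for such a survey is standard; here’s a minimal sketch (my illustration, with an assumed margin of error, not a design from the post or the paper).

```python
# Minimal sample-size sketch for a survey estimating the proportion of people
# who believe the Terms of Service permit this kind of experiment.
# The 3-point margin of error is an assumption for illustration.
from math import ceil

z = 1.96          # 95% confidence
margin = 0.03     # desired margin of error: +/- 3 percentage points
p = 0.5           # worst-case proportion, which maximises the required sample

n = ceil(z**2 * p * (1 - p) / margin**2)
print(n)          # about 1068 randomly selected respondents
```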

[Update: Story in the Herald]

June 9, 2014

Chasing factoids

The Herald says

Almost four in 10 young UK adults describe themselves as digital addicts, according to research published by Foresters, the financial services company.

The story does quote an independent expert who is relatively unimpressed with the definition of ‘digital addict’, but it doesn’t answer the question ‘what sort of research?’

Via Google, I found a press release of a digital addiction survey promoted by Foresters. It’s not clear if the current story is based on a new press release from this survey or a new version of the survey, but the methodology is presumably similar.

So, what is the methodology?

Over 1,100 people across the UK responded to an online survey in November 2013, conducted by Wriglesworth Research

There’s also a related press release from Wriglesworth, but without any more methodological detail. If I Google for “wriglesworth survey”, this is what comes up:

[screenshot: Google results for “wriglesworth survey”, showing poll invitations posted on web forums and Twitter]

That is, the company is at least in the habit of conducting self-selected online polls, advertised on web forums and Twitter.

I tried, but I couldn’t find any evidence that the numbers in this online survey were worth the paper they aren’t written on.
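To illustrate why, here is a toy simulation (mine, with invented response rates, not the Foresters data): if people who feel ‘addicted’ are keener to click through from a forum or Twitter ad, no amount of extra respondents fixes the estimate.

```python
# Toy simulation (all numbers invented) of self-selection bias in an online poll:
# if "addicts" are more likely to respond, the estimate stays biased no matter
# how large the sample gets.
import random

random.seed(1)

true_rate = 0.20             # hypothetical true rate of "digital addiction"
respond_if_addict = 0.10     # addicts: 10% chance of taking the poll
respond_if_not = 0.03        # everyone else: 3% chance

for population in (10_000, 100_000, 1_000_000):
    respondents = addicts = 0
    for _ in range(population):
        is_addict = random.random() < true_rate
        p_respond = respond_if_addict if is_addict else respond_if_not
        if random.random() < p_respond:
            respondents += 1
            addicts += is_addict
    print(population, respondents, f"{addicts / respondents:.0%}")

# Every run estimates roughly 45% "addicts", more than double the true 20%,
# and a bigger sample just makes the wrong answer more precise.
```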

May 22, 2014

Big Data social context

From Cathy O’Neil: Ignore data, focus on power (and, well, most of the stuff on her blog)

From danah boyd and Kate Crawford: Critical Questions for Big Data

Will large-scale search data help us create better tools, services, and public goods? Or will it usher in a new wave of privacy incursions and invasive marketing? Will data analytics help us understand online communities and political movements? Or will it be used to track protesters and suppress speech? Will it transform how we study human communication and culture, or narrow the palette of research options and alter what ‘research’ means?


May 14, 2014

One of the things social media is good for

[Update: 538 now has an intro to the story explaining the mistakes and apologising. Good for them.]

So, at fivethirtyeight.com there’s this story on mapping kidnappings in Nigeria using data from GDELT, the sort of thing data journalism is supposed to be good at. GDELT automatically extracts information from news stories to build a huge global database.

On Twitter, Erin Simpson, whose about.me page says she is “a leading specialist in the intersection of intelligence, data analysis, irregular warfare, and illicit systems – with an emphasis on novel research designs,” — and who has worked on the GDELT parser — is Not Happy.

Thanks to Storify, here are three summaries of what she says, but a lot of it can be boiled down to one point:

In conclusion: VALIDATE YOUR FREAKING DATA. It’s not true just because it’s on a goddamn map.

(via @LewSOS)

May 3, 2014

White House report: ‘Big Data’

There’s a new report “Big Data: Seizing Opportunities, Preserving Values” from the Office of the President (of the USA).  Here’s part of the conclusion (there are detailed recommendations as well)

Big data tools offer astonishing and powerful opportunities to unlock previously inaccessible insights from new and existing data sets. Big data can fuel developments and discoveries in health care and education, in agriculture and energy use, and in how businesses organize their supply chains and monitor their equipment. Big data holds the potential to streamline the provision of public services, increase the efficient use of taxpayer dollars at every level of government, and substantially strengthen national security. The promise of big data requires government data be viewed as a national resource and be responsibly made available to those who can derive social value from it. It also presents the opportunity to shape the next generation of computational tools and technologies that will in turn drive further innovation.

Big data also introduces many quandaries. By their very nature, many of the sensor technologies deployed on our phones and in our homes, offices, and on lampposts and rooftops across our cities are collecting more and more information. Continuing advances in analytics provide incentives to collect as much data as possible not only for today’s uses but also for potential later uses. Technologically speaking, this is driving data collection to become functionally ubiquitous and permanent, allowing the digital traces we leave behind to be collected, analyzed, and assembled to reveal a surprising number of things about ourselves and our lives. These developments challenge longstanding notions of privacy and raise questions about the “notice and consent” framework, by which a user gives initial permission for their data to be collected. But these trends need not prevent creating ways for people to participate in the treatment and management of their information.

You can also read comments on the report by danah boyd, and the conference report and videos from her conference ‘The Social, Cultural & Ethical Dimensions of “Big Data”’ are now online.

March 26, 2014

Are web-based student drinking interventions worthwhile?

Heavy drinking and the societal harm it causes are big issues that attract a lot of media and scholarly attention (and Statschat’s, too). So we were interested to see today’s new release from the Journal of the American Medical Association. It describes a double-blind, parallel-group, individually-randomised trial that studied moderate to heavy student drinkers from seven of our eight universities to see if a web-based alcohol screening and intervention programme reduced their unhealthy drinking behaviour.

And the short answer? Not really. But if they identified as Māori, the answer was … yes, with a caveat. More on that in a moment.

Statistician Nicholas Horton and colleagues used an online questionnaire to identify students at Otago, Auckland, Canterbury, Victoria, Lincoln, Massey, and Waikato who had unhealthy drinking habits. Half the students were assigned at random to receive personalised feedback, and the other half received no feedback. Five months later, researchers followed up with the students on certain aspects of their drinking.

The overall result? “The intervention group tended to have less drinking and fewer problems than the control group, but the effects were relatively modest,” says Professor Horton. The take-away message: A web-based alcohol screening and intervention program had little effect on unhealthy drinking among New Zealand uni students. Restrictions on alcohol availability and promotion are still needed if we really want to tackle alcohol abuse.

But among Māori students, who comprise 10% of our national uni population, those receiving intervention were found to drink 22% less alcohol and to experience 19% fewer alcohol-related academic problems at the five-month follow-up. The paper suggests that Māori students are possibly more heavily influenced by social-norm feedback than non-Māori students. “Māori students may have a stronger group identity, enhanced by being a small minority in the university setting.” But the paper warns that the difference could also be due to chance, “underscoring the need to undertake replication and further studies evaluating web-based alcohol screening and brief intervention in full-scale effectiveness trials.”
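To see why that warning matters, here’s a toy Python simulation (all numbers invented; it illustrates the general subgroup problem, not this trial’s actual analysis): even when an intervention does nothing at all, looking across several subgroups will quite often turn up at least one ‘significant’ result.

```python
# Toy simulation (all numbers invented): split a trial with no real effect
# anywhere into ten subgroups and see how often at least one subgroup looks
# "significant" purely by chance.
import random

random.seed(1)

def fake_trial(n_per_arm=250, n_subgroups=10):
    """Simulate a null trial; count subgroups with a two-sided z-test 'hit'."""
    hits = 0
    for _ in range(n_subgroups):
        a = sum(random.random() < 0.5 for _ in range(n_per_arm))  # events, control
        b = sum(random.random() < 0.5 for _ in range(n_per_arm))  # events, intervention
        p_pool = (a + b) / (2 * n_per_arm)
        se = (2 * p_pool * (1 - p_pool) / n_per_arm) ** 0.5
        z = abs(a - b) / n_per_arm / se if se else 0
        hits += z > 1.96
    return hits

trials = 1000
any_hit = sum(fake_trial() > 0 for _ in range(trials))
print(f"{any_hit / trials:.0%} of null trials show at least one 'significant' subgroup")
# Typically around 40%, which is why the paper's call for replication matters.
```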

The paper is here. Read the JAMA editorial here.