
Feature

Popping the filter bubble: Why finding the truth online isn't easy

Wednesday 13th December 2017

If you're searching for vaccine info on the internet, there are a few things you need to know.

Illustration: Rachel Neser

In 2014, an Australian computer scientist submitted a paper to an academic journal titled “Get me off Your Fucking Mailing List”. To his surprise, it was accepted.

The fake paper was originally written by American researchers and consisted of the title’s seven words repeated over and over again. It also included helpful diagrams.

Computer scientist Dr Peter Vamplew was fed up with emails from dodgy academic journals filling his inbox and so he sent the paper to the International Journal of Advanced Computer Technology hoping his message would get across nice and clear.

“They told me to add some more recent references and do a bit of reformatting,” he told The Guardian. “But otherwise they said its suitability for the journal was excellent.”

Vamplew was required to pay a $150 fee to have the paper published, which he didn’t do.

Vamplew isn’t the only one who has found a way to troll money-focused journals: more than 10 years ago, three MIT students built software that generates bogus research papers, and last year a New Zealand professor successfully submitted a paper to a nuclear physics conference written entirely by the iOS autocomplete function.

The online world has become a dense jungle of information. Between echo chambers, filter bubbles and dodgy research, it’s easy to get tripped up by seemingly valid studies or fake news on Facebook. When making important decisions, like whether or not to vaccinate, finding the truth is vital. So how do you avoid the traps and get to the real stuff?

Dr Adam Stevens, an astrophysicist at The University of Western Australia, says one challenge is distinguishing shoddy research from actual studies. And the confusion is the result of an unregulated market.

“The question is a case of ‘who watches the Watchmen?’ Unfortunately, research publication has become a money-making business for many. This incentivises some journals to publish whatever they can and charge the author,” he says.

While completely bogus papers may be easy to spot, it’s more difficult to pick out journals that publish research without any peer review or editing.

“It's pretty hard to make a judgment as a non-expert. While impact factor is one metric, some papers can be cited for being bad examples or intentionally controversial,” says Dr Stevens.

The growing number of “open access” journals – where the author pays a fee to the journal and then the material is free for everyone to access – is one of the reasons for the proliferation of bad science.

It’s estimated that up to 10 percent of open-access journals are exploiting the model by charging a fee to proofread, peer-review and edit a research paper without actually carrying out the work.

It’s a complex problem, though. Dr Siouxsie Wiles, a senior lecturer at the University of Auckland, says that despite the issues, open access is an important shift away from the conventional model, where companies charge institutions huge amounts of money to access research.

“Open access is a legitimate response to the fact that the publication of scholarly work is a very lucrative business. Academic publishers like Elsevier make more profit than companies like Apple!”

“Predatory journals are heart-breaking because open access is part of the solution to a really serious problem – the stranglehold big publishing houses have on scholarly literature.”

Nevertheless, predatory journals have opened up the floodgates to poorly carried-out studies about vaccinations that continue to fuel the anti-vax movement.

“It’s provided a platform where people can publish bad science without it being challenged, and have it then appear in the scholarly literature as though it’s legitimate,” she says.

Dr Wiles says it’s hard to figure out what is good versus bad science without looking at the actual data and even then, sometimes it’s impossible to tell without trying to repeat the study.

“A good example of that is some papers published by Andrew Wakefield claiming to prove that they had found genetic evidence of the measles vaccine in the guts of children with autism but not children without autism.”

The original study was published in 1998 in the journal The Lancet and involved 12 patients. She says that if you read the papers, the data looks valid. But when researchers tried to repeat the study, they couldn’t. There were major flaws.

Wakefield was struck off the medical register after his paper and practices were found to be fraudulent. The claimed link between the MMR vaccine and autism has been widely and repeatedly debunked, but serious damage was done by Wakefield’s paper: vaccination rates plummeted, leading to outbreaks of measles.

But many won’t make their minds up about vaccinating kids by reading the research in journals; instead, they might search Google.

“Search engine algorithms mean that if you have a history of reading anti-vax stuff, that’s what Google will serve you more of. We all see a different internet,” says Dr Wiles.

Social media networks work in a similar way. Once you start engaging with anti-vax material, the algorithms register that interest and serve up more of it.

On top of that, since people often organise themselves into groups of people who share their opinions, it's easy to get stuck in one-sided bubbles of information.
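The feedback loop described here can be shown with a toy ranking model. This is a minimal sketch, not any real platform’s algorithm; the topic names, starting weights and click behaviour are all invented for illustration:

```python
import random
from collections import Counter

def rank_feed(posts, interest):
    """Order posts so topics the user has engaged with most come first."""
    return sorted(posts, key=lambda p: interest[p["topic"]], reverse=True)

def simulate(posts, clicks=20, feed_size=3, seed=0):
    """The user repeatedly clicks something near the top of their feed;
    each click teaches the ranker to push that topic higher."""
    rng = random.Random(seed)
    interest = Counter({p["topic"]: 1 for p in posts})  # start with no preference
    for _ in range(clicks):
        feed = rank_feed(posts, interest)[:feed_size]
        clicked = rng.choice(feed)         # the user clicks a top-ranked post
        interest[clicked["topic"]] += 1    # the ranker "learns" the interest
    return interest

# Three topics, several posts each (topic names are invented examples).
posts = [{"topic": "anti-vax"}, {"topic": "pro-vax"}, {"topic": "gardening"}] * 5
final = simulate(posts)
# Whichever topic happens to get the first click ends up dominating the feed.
```

Even starting from equal interest in every topic, a single early click tips the ranking, and from then on the simulated user only ever sees that one topic: a filter bubble in miniature.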

“People only see the anti-vax stuff, which then makes them think that is all there is,” says Dr Wiles.

A 2011 study looking at vaccination sentiments on Twitter found that whether you’re pro or anti-vaccination, your social media accounts can act as an “echo chamber” where “personal opinions that affect individual medical decisions are predominately reaffirmed by others.”

The authors noted that clustering of anti-vaccination sentiment could lead to clusters of lower vaccine uptake and more disease outbreaks.

It’s not just a Twitter problem. Dr Giordano Pérez Gaxiola, pediatrician and director of the Evidence-based Medicine Department in a children’s hospital in México, has looked into the phenomenon of “filter bubbles” on Facebook.

“Filter bubble” is a term for the personalised results you get when searching for information on a website.

“Facebook ‘learns’ about your interests with each click you make liking someone's post or a shared article, and uses this information to make your timeline more personal,” he says.

Dr Pérez Gaxiola had been noticing an increase in parents refusing to vaccinate their children and wanted to know where they were getting their information from.

“At least once a month I encounter a child who doesn't have any vaccinations. And since I'm fairly active on social media, I noticed quite a few anti-vaccination posts, especially on Facebook,” he says.

He set up a new Facebook account and ran a search for “vaccine harms”. Groups and pages were suggested even as he typed.

“We visited the first results and clicked "like" on every first suggestion. Within three clicks we had identified three anti-vaccine pages with over 125,000 "likes" and our timeline was immediately populated by their posts.”

After "liking" the first post that appeared in the timeline, Dr Pérez Gaxiola saw more suggestions about anti-vaccine pages, all with more than 10,000 followers.

“The obvious observation was that this social network can create an ideological bubble and feeds you the information you will probably like.”

He explains that the problem is that it can foster confirmation bias and steer people to make the wrong decisions about their health.

“Take the example of the myth about the measles vaccine and autism. This claim has been disproven in multiple studies with thousands of children. But if a person keeps seeing posts from friends or loved ones claiming that it is true the belief gets reinforced. Anecdotes are powerful tools of persuasion.”

In a follow-up study, Dr Pérez Gaxiola and his team looked at risk factors for vaccine refusal in kindergartens and found an association between online searching for information about vaccines and negative attitudes toward vaccination.  

“I do not think there is a way to avoid filter bubbles. But there is a way to be more careful about the things we share on Facebook and other social media,” he says.

After the “filter bubbles” study, his team launched a campaign called "Comparte Con Cuidado" or “Share with Care” in English. They encouraged everyone to follow four steps before posting or sharing articles about vaccines: Critically appraise, Check other sources, Write your own conclusions and then share.

Dr Siouxsie Wiles says it's an incredibly difficult task to find the truth on the internet. She says it's important to know about how algorithms work so that people are aware that their research will always be biased.

“I think understanding that is key to understanding how hard it really is to properly ‘do your research’.”



Mava is an award-winning journalist for The Wireless.