The Spreading of Misinformation Online

Trailspotter

Senior Member.
A scientific paper with the above title, published yesterday in PNAS Early Edition, may be of interest to Metabunk members:
http://www.pnas.org/content/early/2016/01/02/1517441113.full.pdf

Significance

The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web is a fruitful environment for the massive diffusion of unverified rumors. In this work, using a massive quantitative analysis of Facebook, we show that information related to distinct narratives—conspiracy theories and scientific news—generates homogeneous and polarized communities (i.e., echo chambers) having similar information consumption patterns. Then, we derive a data-driven percolation model of rumor spreading that demonstrates that homogeneity and polarization are the main determinants for predicting cascades’ size.

Abstract

The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web (WWW) also allows for the rapid dissemination of unsubstantiated rumors and conspiracy theories that often elicit rapid, large, but naive social responses such as the recent case of Jade Helm 15—where a simple military exercise turned out to be perceived as the beginning of a new civil war in the United States. In this work, we address the determinants governing misinformation spreading through a thorough quantitative analysis. In particular, we focus on how Facebook users consume information related to two distinct narratives: scientific and conspiracy news. We find that, although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, cascade dynamics differ. Selective exposure to content is the primary driver of content diffusion and generates the formation of homogeneous clusters, i.e., “echo chambers.” Indeed, homogeneity appears to be the primary driver for the diffusion of contents and each echo chamber has its own cascade dynamics. Finally, we introduce a data-driven percolation model mimicking rumor spreading and we show that homogeneity and polarization are the main determinants for predicting cascades’ size.

Freely available online through the PNAS open access option.
Content from External Source
This article contains supporting information online:
Supporting Information Appendix: Echo chambers in the age of misinformation
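
For anyone who wants a feel for what a "data-driven percolation model" looks like, here is a minimal sketch, not the authors' actual code: users on a random network each get a polarization value in [0, 1], and a rumor spreads only to neighbours whose polarization lies within a threshold of the content's. The threshold is my own stand-in for the echo-chamber homogeneity the paper measures.

import random

def make_network(n, k):
    # Crude random graph: link each user to roughly k random others.
    neighbours = {u: set() for u in range(n)}
    for u in range(n):
        for v in random.sample(range(n), k):
            if v != u:
                neighbours[u].add(v)
                neighbours[v].add(u)
    return neighbours

def cascade_size(neighbours, polarization, content_pol, threshold):
    # Percolation-style spread: a user reshares the rumor only if it
    # lies close enough to their own polarization (the homogeneity
    # effect the paper measures).
    seed = random.randrange(len(neighbours))
    shared, frontier = {seed}, [seed]
    while frontier:
        u = frontier.pop()
        for v in neighbours[u]:
            if v not in shared and abs(polarization[v] - content_pol) < threshold:
                shared.add(v)
                frontier.append(v)
    return len(shared)

random.seed(1)
net = make_network(2000, 5)
pol = [random.random() for _ in range(2000)]
for threshold in (0.1, 0.3, 0.5):
    sizes = [cascade_size(net, pol, random.random(), threshold)
             for _ in range(200)]
    print(threshold, sum(sizes) / len(sizes))

Raising the threshold sharply increases the average cascade size, which is the paper's claim in miniature: how far a rumor travels depends less on the content itself than on the homogeneity of the audience around it.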
 
Very interesting. Not sure exactly what this means, however. There is a similar dissemination pattern for science and conspiracy but a dissimilar acceptance or belief pattern for science versus conspiracy?
 

If I'm reading the article correctly, science quickly reaches a broader audience (greater diffusion) but interest and cascade lifetime are poorly correlated. Conspiracy diffuses more slowly, but interest continues to increase with time.
 
Quite interesting. Thanks!
 
Came across an interesting Washington Post article about [this study] conducted to determine how misinformation spreads across the internet.

Figuring out how such ideas diffuse through social media may be key to scientists and science communicators alike as they look for ways to better reach the public and change the minds of those who reject their information. A study published Monday in Proceedings of the National Academy of Sciences sheds new light on the factors that influence the spread of misinformation online.
Content from External Source
The researchers conclude that the diffusion of content generally takes place within clusters of users known as “echo chambers” — polarized communities that tend to consume the same types of information. For instance, a person who shares a conspiracy theory online is typically connected to a network of other users who also tend to consume and share the same types of conspiracy theories. This structure tends to keep the same ideas circulating within communities of people who already subscribe to them, a phenomenon that both reinforces the worldview within the community and makes members more resistant to information that doesn’t fit with their beliefs.
Content from External Source
https://www.washingtonpost.com/news...s-climate-doubt-spreads-through-social-media/

*Edit
Thanks for merging this, Mick. I should have searched to see if there was already a thread.
 
YouTube is a big problem. There are these fringe groups that post UFO hoaxes, for instance, or false claims about the Apollo moon landings or 9/11, and [some people] gobble it up and spread those false videos and ideas all over social media sites like Facebook...

I had a lengthy debate last year with an otherwise intelligent woman about the moon landings. She saw a couple of videos on YouTube and became convinced the moon landings were faked... In the end, after listening to and reading all the evidence I presented to her, she resolved to accept the ridiculous YouTube videos, saying seeing was believing and we never went to the moon...
 
...In the end, after listening to and reading all the evidence I presented to her, she resolved to accept the ridiculous YouTube videos, saying seeing was believing and we never went to the moon...

In the center of my avatar image, the little white speck casting a shadow is the bottom half of the Apollo 11 lunar module, photographed from lunar orbit in 2009. Of course, "seeing is believing" generally applies only to things that the seer already believes, so this won't convince anyone.

Back to the original topic, I've been heavily involved in a few online forums for many years, and the "echo chamber" effect is very real. Even if a community starts out with a range of opinions, it frequently happens that a small group of true believers will eventually annoy the heck out of everyone else, and drive them off. Forum moderators can encourage or discourage this, but never have complete control.

I don't think Facebook has improved the situation any -- it just makes it even easier to shut out ideas and people that you disagree with.

I find this discouraging. On the one hand, the Internet has become an amazing way for people to find friends and to learn things, with no artificial limitations imposed by physical distance. And on the other hand, so many of us use it solely to find clones of ourselves and echoes of what we already think.
 


This video from (the incredibly great) CGP Grey channel is a good dissection of how and why ideas such as conspiracy theories and misinformation spread across social media.
 
Saw this article the other day (ironically, shared on my Facebook feed), which gives some excellent examples of the algorithms that Facebook and other social media sites use to populate feeds.

After an anonymous source alleged that Facebook's Trending News algorithm (and human staff) was intentionally hiding conservative news from the social network, all hell broke loose. Congressional hearings have been called. Whether the reports are right—and whether hearings are justified—underneath the uproar is a largely unspoken truth: The algorithms that drive social networks are shifting the reality of our political systems—and not for the better.
Content from External Source
First it is important to understand the technology that drives the system. Most algorithms work simply: Web companies try to tailor their content (which includes news and search results) to match the tastes and interests of readers. However, as online organizer and author Eli Pariser says in the TED Talk where the idea of the filter bubble became popularized: "There's a dangerous unintended consequence. We get trapped in a 'filter bubble' and don't get exposed to information that could challenge or broaden our worldview."
Content from External Source
http://www.fastcoexist.com/3059742/...rting-reality-by-boosting-conspiracy-theories
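
As a rough illustration of the mechanism Pariser describes, here is a toy feed ranker of my own invention (nothing like Facebook's real system): items are scored by similarity to the user's interest profile, and the profile is nudged toward whatever gets clicked, so exposure narrows on its own without anyone deciding to hide anything.

import random

TOPICS = 5  # pretend these are politics, science, sport, ...

def similarity(item, interests):
    # Dot product: how well an item matches the user's profile.
    return sum(a * b for a, b in zip(item, interests))

def rank_feed(items, interests):
    # Most-similar-first: the core of any personalisation loop.
    return sorted(items, key=lambda it: similarity(it, interests), reverse=True)

random.seed(0)
interests = [0.2] * TOPICS  # start out with broad, even interests
for day in range(30):
    items = [[random.random() for _ in range(TOPICS)] for _ in range(20)]
    clicked = rank_feed(items, interests)[0]  # user clicks the top item
    # Fold the click back into the profile, keeping total attention fixed.
    interests = [0.9 * i + 0.1 * c for i, c in zip(interests, clicked)]
    total = sum(interests)
    interests = [i / total for i in interests]

print([round(i, 2) for i in interests])  # no longer even: a bubble forms

After thirty simulated days the interest profile is visibly lopsided, even though every individual step was just "show people more of what they seem to like".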
 
The last few posts on the 'channeling' effect of social media, and on the increasing ability of people to filter out things they don't agree with, brought to mind a New York Times article that discussed the rise of "Safe Spaces" at universities and implied that people are spreading this kind of reality filtering offline.

The safe space, Ms. Byron explained, was intended to give people who might find comments “troubling” or “triggering,” a place to recuperate. The room was equipped with cookies, coloring books, bubbles, Play-Doh, calming music, pillows, blankets and a video of frolicking puppies, as well as students and staff members trained to deal with trauma. Emma Hall, a junior, rape survivor and “sexual assault peer educator” who helped set up the room and worked in it during the debate, estimates that a couple of dozen people used it. At one point she went to the lecture hall — it was packed — but after a while, she had to return to the safe space. “I was feeling bombarded by a lot of viewpoints that really go against my dearly and closely held beliefs,” Ms. Hall said.
Content from External Source
http://www.nytimes.com/2015/03/22/opinion/sunday/judith-shulevitz-hiding-from-scary-ideas.html?_r=0
 
Google is becoming less useful these days for finding factual information.
It's much easier to find "less-factual" information on Google.
These days, the relative ease of creating a blog, site, or forum filled with opinions or promotion pushes keywords into Google that, if trending enough, end up near the top of search results.

(side note)
I'd hazard a guess that most people are not savvy enough, or simply not experienced enough, to do thorough and specific Google searches. It's a clever art.
From my experience, most people don't have any idea what "search operators" are.
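
For anyone who hasn't met them, search operators are just extra syntax typed into the search box. A few that really do work in Google, shown as example queries (the nexrad examples are mine):

"exact phrase here"        results must contain the phrase verbatim
nexrad site:noaa.gov       restrict results to a single site
nexrad -site:youtube.com   exclude a site from the results
nexrad filetype:pdf        only return PDF documents
intitle:nexrad             the word must appear in the page title
contrails OR chemtrails    match either term

Combining two or three of these usually cuts through the trending blogspam described above.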
 
That's how I ended up here. I was challenged to "Google Nexrad".

He obviously just stopped at the first 6 YouTube videos that came up.
 
Here's an article from HuffPost on how YouTube uses algorithms that suck people down the conspiracy theory rabbit hole. It describes "5 of the wildest conspiracy theories YouTube promoted in 2018". I have a family member who will mention "watching the news", and I know that it means watching these types of videos for hours a day. :(

5 Of The Wildest Conspiracy Theories YouTube Promoted In 2018
Once you fall down YouTube’s conspiracy theory rabbit hole, its algorithm will continue feeding you disinformation.
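
The rabbit-hole dynamic is easy to caricature in code. This is my own toy model, based on nothing from YouTube's internals: give each video an extremeness score, assume content slightly more extreme than the last video holds attention a little longer, and let autoplay always pick the candidate with the highest expected watch time.

import random

def expected_watch_time(current, candidate):
    # Invented engagement model: content a notch more extreme than the
    # last video is assumed to hold attention best.
    return 1.0 - abs(candidate - (current + 0.05))

random.seed(2)
video = 0.1  # 0.0 = mainstream, 1.0 = full-on conspiracy
for step in range(20):
    candidates = [random.random() for _ in range(10)]
    # Autoplay the candidate that maximises expected watch time.
    video = max(candidates, key=lambda c: expected_watch_time(video, c))
    print(step, round(video, 2))

No single step looks sinister, since the recommender only ever maximises watch time one video ahead; but because the incentive always points slightly outward, a session that starts at 0.1 reliably ends near 1.0.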
 
The first part of the following study examines in detail how Russian media outlets played a part in introducing and spreading misinformation about the Skripal incident online. I found it interesting that the conspiracy theory mechanisms we all know can apparently be employed as part of a government's media and foreign policy strategy.

https://www.kcl.ac.uk/policy-institute/assets/weaponising-news.pdf
Weaponizing News
RT, Sputnik, and targeted disinformation

About the Policy Institute at King’s College London
The Policy Institute at King’s College London aims to solve society’s challenges with evidence and expertise, by combining the rigour of academia with the agility of a consultancy and the connectedness of a think tank. Our defining characteristic is our multidisciplinary and multi-method approach, drawing on the wide range of skills in our team and the huge resource in King’s and its wider network.
About the Centre for the Study of Media, Communication & Power
The Centre for the Study of Media, Communication & Power explores how news provision, political communication, public discourse, civic engagement and media power are changing in the digital age. We do this through rigorous empirical research, and communication of the findings of this research to inform relevant academic and public policy debates and civic society responses, in order to help promote diversity, fairness, transparency and accountability in media and communication.
About the authors
Dr Gordon Ramsay has been conducting and publishing media and communication research for the past decade. He holds a PhD in Political Communication from the University of Glasgow (2011) and is the co-author, with Dr Martin Moore, of UK Media Coverage of the 2016 EU Referendum Campaign and Monopolising Local News. He has co-developed the content analysis research tool Steno with the developer Ben Campbell, and has previously published research on media regulation and policy at the Media Standards Trust, the University of Westminster, and Cardiff University. Email: g.ramsay@westminster.ac.uk
Dr Sam Robertshaw has a PhD in Politics from the University of Glasgow. He has conducted research on society-military relations in the UK and in Russia. His research has been published in Global Affairs and European Security.
[..]
This project was funded by the Open Society Foundation.
Content from External Source
From the executive summary (p.6 f.):

RT, Sputnik and Russian framing of the Skripal incident
  • Coverage of the Skripal incident on RT and Sputnik was abundant, with 735 articles on the story published across both sites over the four weeks following the discovery of the poisoning.
  • In total, 138 separate – and often contradictory – narratives explaining the incident and its aftermath were published by RT and Sputnik during this period, ranging from interpretations of Western motives, to explanations of the origins of the nerve agent used in the poisoning, to full-blown conspiracy theories.
  • Narratives often appeared following public interventions by Western governments. Following Theresa May’s speech to the UK Parliament on 12th March in which Russia was accused and the nerve agent ‘Novichok’ identified, a flurry of narratives contesting the origins and existence of Novichok appeared on RT and Sputnik, and narratives framing the incident as defined by geopolitics and Western domestic political problems began to emerge.
  • The most frequently repeated narratives supporting the Russian position asserted that Russia’s willingness to cooperate was being rejected by the West, that there was no evidence to prove Russian guilt, and that the Western response was driven by ‘Russophobia’ and hysteria. Theresa May’s accusation of Russian guilt was frequently cited, but often immediately rebutted by editorial statements by RT or Sputnik.
  • 215 separate sources were quoted as providing one or more narratives on the Skripal incident and its aftermath. Many of these were non-governmental Russian and ‘expert’ non-Russian sources, columnists and fringe or right-wing politicians who formed a ‘parallel commentariat’ supplying pro-Russian or anti-West interpretations of events.
  • Elite Russian government sources – individual and institutional – were prominent in coverage on RT and Sputnik and supplied a range of narratives ranging from the conciliatory to the conspiratorial. Their elite status and use of combative and confrontational language towards Western counterparts resulted in substantial coverage by mainstream UK media. This was the most successful means by which pro-Russian narratives were inserted into Western news content.
Content from External Source
Pages 23 to 45 have the details on this topic. Other sections of the study cover the phenomenon of "churnalism", the methodology used, results on UK media echoing these Russian narratives, and how pushing "government dysfunction" is part of the strategy. I read that as promoting distrust of government, which plays a huge role in many conspiracy theories.
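
As an aside on method: counts like "735 articles" and "138 separate narratives" come from content analysis, where coders tag each article with the narratives it advances and the tags are tallied afterwards. A minimal sketch of the tallying step, with invented data (the study documents its actual methodology in a separate section):

from collections import Counter

# Invented stand-ins for coded articles; the real dataset was four
# weeks of RT and Sputnik coverage.
coded_articles = [
    {"outlet": "RT",      "narratives": ["no evidence", "Russophobia"]},
    {"outlet": "Sputnik", "narratives": ["no evidence"]},
    {"outlet": "RT",      "narratives": ["Novichok origins contested"]},
]

articles_per_outlet = Counter(a["outlet"] for a in coded_articles)
narrative_counts = Counter(n for a in coded_articles
                             for n in a["narratives"])

print(articles_per_outlet)
print(narrative_counts.most_common())

The counting is the trivial part; the real work is the human coding of each article, which no script can replace.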
 