
Echo chambers in the age of misinformation

Published by: Walter Quattrociocchi
Publication date: 2015
Research field: Informatics Engineering
Paper language: English





The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. Despite the enthusiastic rhetoric on the part of some that this process generates collective intelligence, the WWW also allows the rapid dissemination of unsubstantiated conspiracy theories that often elicit rapid, large, but naive social responses, as in the recent case of Jade Helm 15, where a simple military exercise came to be perceived as the beginning of a civil war in the US. We study how Facebook users consume information related to two different kinds of narrative: scientific and conspiracy news. We find that although consumers of scientific and conspiracy stories present similar consumption patterns with respect to content, the sizes of the spreading cascades differ. Homogeneity appears to be the primary driver for the diffusion of content, but each echo chamber has its own cascade dynamics. To mimic these dynamics, we introduce a data-driven percolation model on signed networks.
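The abstract does not spell out the percolation model, but the general idea of content percolating only through like-minded (positive) ties of a signed network can be sketched as follows; the toy network, the transmission probability p, and all names are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a percolation-style cascade on a signed network:
# a piece of content spreads only across positive ("like-minded") edges,
# each crossed independently with an assumed transmission probability p.
import random
import networkx as nx

def simulate_cascade(graph, seed, p=0.3):
    """Return the set of nodes reached by a cascade started at `seed`.

    `graph` is an undirected networkx graph whose edges carry a
    'sign' attribute (+1 for agreement, -1 for disagreement).
    """
    reached = {seed}
    frontier = [seed]
    while frontier:
        node = frontier.pop()
        for neighbor in graph.neighbors(node):
            if neighbor in reached:
                continue
            # content percolates only through positive ties
            if graph[node][neighbor].get("sign", 1) == 1 and random.random() < p:
                reached.add(neighbor)
                frontier.append(neighbor)
    return reached

# Toy example: a small signed network with two loosely connected camps.
g = nx.Graph()
g.add_edge("a", "b", sign=+1)
g.add_edge("b", "c", sign=+1)
g.add_edge("c", "d", sign=-1)   # hostile tie between the two camps
g.add_edge("d", "e", sign=+1)

sizes = [len(simulate_cascade(g, "a", p=0.8)) for _ in range(1000)]
print(sum(sizes) / len(sizes))  # average cascade size stays within one camp

Run repeatedly, the toy cascades remain confined to the seed's positively connected community, which is the qualitative echo-chamber behavior the abstract describes.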


Read also

Social media enabled a direct path from producer to consumer of content, changing the way users get informed, debate, and shape their worldviews. Such disintermediation weakened consensus on socially relevant issues in favor of rumors and mistrust, and fomented conspiracy thinking -- e.g., chem-trails inducing global warming, the link between vaccines and autism, or the New World Order conspiracy. In this work, we study through a thorough quantitative analysis how different conspiracy topics are consumed on the Italian Facebook. By means of a semi-automatic topic extraction strategy, we show that the most discussed contents semantically refer to four specific categories: environment, diet, health, and geopolitics. We find similar patterns by comparing users' activity (likes and comments) on posts belonging to different semantic categories. However, if we focus on the lifetime -- i.e., the time between a user's first and last comment -- we notice a remarkable difference across narratives: users polarized on geopolitics are the most persistent in commenting, whereas the least persistent are those focused on diet-related topics. Finally, we model users' mobility across topics, finding that the more active a user is, the more likely they are to engage with all topics: once inside a conspiracy narrative, users tend to embrace the overall corpus.
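As a concrete illustration of the lifetime measure described above (the time between a user's first and last comment on a narrative), a minimal sketch with assumed column names might look like the following; the data and schema are made up for illustration, not the paper's dataset.

# Hypothetical sketch of the per-user "lifetime" measure: the time elapsed
# between a user's first and last comment within each semantic category.
import pandas as pd

comments = pd.DataFrame({
    "user_id":  [1, 1, 2, 2, 2, 3],
    "category": ["geopolitics", "geopolitics", "diet", "diet", "diet", "health"],
    "timestamp": pd.to_datetime([
        "2014-01-01", "2014-06-01",
        "2014-02-01", "2014-02-03", "2014-02-10",
        "2014-03-01",
    ]),
})

lifetimes = (
    comments.groupby(["user_id", "category"])["timestamp"]
    .agg(lambda ts: ts.max() - ts.min())
    .rename("lifetime")
    .reset_index()
)

# Compare persistence across semantic categories.
print(lifetimes.groupby("category")["lifetime"].mean())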
Echo chambers may exclude social media users from exposure to other opinions and can therefore cause rampant negative effects. Among the abundant evidence are the conspiracy theories and polarization surrounding the 2016 and 2020 US presidential elections, as well as the COVID-19 disinfodemic. To help better detect echo chambers and mitigate their negative effects, this paper explores the mechanisms and attributes of echo chambers in social media. In particular, we first illustrate four primary mechanisms related to three main factors: human psychology, social networks, and automatic systems. We then depict common attributes of echo chambers, with a focus on the diffusion of misinformation, the spreading of conspiracy theories, the creation of social trends, political polarization, and the emotional contagion of users. We illustrate each mechanism and attribute from the perspectives of sociology, psychology, and social computing, with recent case studies. Our analysis suggests an emerging need to detect echo chambers and mitigate their negative effects.
While social media make it easy to connect with and access information from anyone, they also facilitate basic influence and unfriending mechanisms that may lead to segregated and polarized clusters known as echo chambers. Here we study the conditions under which such echo chambers emerge by introducing a simple model of information sharing in online social networks with the two ingredients of influence and unfriending. Users can change both their opinions and their social connections based on the information to which they are exposed through sharing. The model dynamics show that even with minimal amounts of influence and unfriending, the social network rapidly devolves into segregated, homogeneous communities. These predictions are consistent with empirical data from Twitter. Although our findings suggest that echo chambers are somewhat inevitable given the mechanisms at play in online social media, they also provide insights into possible mitigation strategies.
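A minimal toy sketch of the two ingredients named above, influence and unfriending, is given below; the update rule, tolerance, and rewiring probability are assumptions chosen for illustration and are not the authors' exact model.

# Hypothetical sketch: opinions live in [-1, 1]; close opinions attract
# (influence), distant ones may trigger dropping the tie and rewiring
# to a random non-neighbor (unfriending).
import random

def step(opinions, neighbors, tolerance=0.5, mu=0.1, unfriend_p=0.2):
    i = random.choice(list(opinions))
    if not neighbors[i]:
        return
    j = random.choice(list(neighbors[i]))
    if abs(opinions[i] - opinions[j]) < tolerance:
        # influence: i moves toward j's opinion
        opinions[i] += mu * (opinions[j] - opinions[i])
    elif random.random() < unfriend_p:
        # unfriending: drop the tie and rewire to a random non-neighbor
        neighbors[i].discard(j)
        neighbors[j].discard(i)
        candidates = set(opinions) - neighbors[i] - {i}
        if candidates:
            k = random.choice(sorted(candidates))
            neighbors[i].add(k)
            neighbors[k].add(i)

# Toy run on a small random network.
n = 50
opinions = {u: random.uniform(-1, 1) for u in range(n)}
neighbors = {u: set() for u in range(n)}
for _ in range(4 * n):
    a, b = random.sample(range(n), 2)
    neighbors[a].add(b)
    neighbors[b].add(a)

for _ in range(20000):
    step(opinions, neighbors)

# Crude check: connected users end up with similar opinions as the
# network segregates into homogeneous clusters.
diffs = [abs(opinions[u] - opinions[v]) for u in neighbors for v in neighbors[u]]
print(sum(diffs) / len(diffs))

Even in this crude version, raising the unfriending probability tends to split the network into opinion-homogeneous components, mirroring the segregation the abstract describes.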
Recent studies have shown that online users tend to select information adhering to their system of beliefs, ignore information that does not, and join groups -- i.e., echo chambers -- around a shared narrative. Although a quantitative methodology for their identification is still missing, the phenomenon of echo chambers is widely debated at both the scientific and the political level. To shed light on this issue, we introduce an operational definition of echo chambers and perform a massive comparative analysis on more than 1B pieces of content produced by 1M users on four social media platforms: Facebook, Twitter, Reddit, and Gab. We infer the leaning of users on controversial topics -- ranging from vaccines to abortion -- and reconstruct their interaction networks by analyzing different features, such as shared link domains, followed pages, follower relationships, and commented posts. Our method quantifies the existence of echo chambers along two main dimensions: homophily in the interaction networks and bias in the information diffusion toward like-minded peers. We find peculiar differences across social media: while Facebook and Twitter present clear-cut echo chambers in all the observed datasets, Reddit and Gab do not. Finally, we test the role of the social media platform on news consumption by comparing Reddit and Facebook. Again, we find support for the hypothesis that platforms implementing news feed algorithms, like Facebook, may elicit the emergence of echo chambers.
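The homophily dimension mentioned above can be illustrated with a simple, hedged sketch: compare each user's inferred leaning with the average leaning of their interaction neighbors. The correlation function, the toy graph, and the leaning values below are illustrative assumptions, not the paper's operational definition.

# Hypothetical sketch of a homophily check: a strong positive correlation
# between users' leanings and their neighbors' mean leaning is the
# echo-chamber signature in the interaction network.
import networkx as nx
import numpy as np

def neighbor_leaning_correlation(graph, leaning):
    """Pearson correlation between each user's leaning and the mean leaning of their neighbors."""
    own, avg = [], []
    for node in graph.nodes:
        neigh = list(graph.neighbors(node))
        if not neigh:
            continue
        own.append(leaning[node])
        avg.append(np.mean([leaning[v] for v in neigh]))
    return np.corrcoef(own, avg)[0, 1]

# Toy interaction network with leanings in [-1, 1] (e.g., pro/anti a topic).
g = nx.karate_club_graph()
leaning = {v: (1.0 if g.nodes[v]["club"] == "Mr. Hi" else -1.0) for v in g.nodes}
print(neighbor_leaning_correlation(g, leaning))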
Social media radically changed how information is consumed and reported. Moreover, social networks elicited disintermediated access to an unprecedented amount of content. The World Health Organization (WHO) coined the term infodemics to identify the information overabundance during an epidemic; indeed, the spread of inaccurate and misleading information may alter behaviors and complicate crisis management and health responses. This paper addresses information diffusion during the COVID-19 pandemic period with a massive data analysis on YouTube. First, we analyze the engagement of more than 2M users with 13,000 videos released by 68 different YouTube channels with different political bias and fact-checking indexes. We then investigate the relationship between each user's political preference and their consumption of questionable/reliable information. Our results, quantified using information-theoretic measures, provide evidence for the existence of echo chambers across two dimensions, represented by the political bias and by the trustworthiness of information channels. Finally, we observe that the echo chamber structure cannot be reproduced after properly randomizing the users' interaction patterns.
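In the spirit of the information-theoretic measures mentioned above, one hedged sketch is to compute the empirical mutual information between a user's political leaning and the reliability class of the channels they engage with, then compare it against a shuffled null model; the data, labels, and function below are illustrative assumptions, not the paper's pipeline.

# Hypothetical sketch: mutual information between two discrete variables,
# plus a shuffled baseline in place of a proper randomization test.
import numpy as np
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        mi += p_xy * np.log2(p_xy / ((px[x] / n) * (py[y] / n)))
    return mi

# Toy data: leaning of the user vs. reliability of the consumed channel.
leanings    = ["left", "left", "right", "right", "left", "right", "left", "right"]
reliability = ["reliable", "reliable", "questionable", "questionable",
               "reliable", "questionable", "reliable", "reliable"]
print(mutual_information(leanings, reliability))

# A simple null model: shuffle the interaction pattern and recompute.
rng = np.random.default_rng(0)
shuffled = rng.permutation(reliability)
print(mutual_information(leanings, list(shuffled)))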