The advent of the WWW changed the way we produce and access information. Recent studies showed that users tend to select information that is consistent with their system of beliefs, forming polarized groups of like-minded people around shared narratives where dissenting information is ignored. In this environment, users cooperate to frame and reinforce their shared narrative, making any attempt at debunking ineffective. Such a configuration occurs even in the consumption of news online, and considering that 63% of users access news directly from social media, one hypothesis is that greater polarization allows misinformation to spread further. Along this path, we focus on the polarization of users around news outlets on Facebook in different European countries (Italy, France, Spain and Germany). First, we compare the pages' posting behavior and the users' interaction patterns across countries and observe different posting, liking and commenting rates. Second, we explore the tendency of users to interact with different pages (i.e., selective exposure) and the emergence of polarized communities around specific pages. Then, we introduce a new metric -- the polarization rank -- to measure the polarization of communities for each country. We find that Italy is the most polarized country, followed by France, Germany and lastly Spain. Finally, we present a variation of the Bounded Confidence Model to simulate the emergence of these communities by taking into account users' engagement and trust in the news. Our findings suggest that trust in the information broadcaster plays a pivotal role against the polarization of users online.
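The abstract does not spell out the update rule of the proposed Bounded Confidence variation. Below is a minimal sketch, assuming a Deffuant-style pairwise model in which a per-agent `trust` value widens the confidence bound `epsilon`; the parameter names, the trust-weighted threshold, and the cluster-counting heuristic are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

# Sketch of a bounded confidence (Deffuant-style) opinion dynamics simulation.
# The trust-modulated bound is an assumption for illustration only.
rng = np.random.default_rng(0)

N = 1000                     # number of agents (users)
steps = 100_000              # pairwise interaction steps
epsilon = 0.3                # baseline confidence bound
mu = 0.5                     # convergence rate
opinions = rng.uniform(-1, 1, N)   # opinions on the news narrative
trust = rng.uniform(0, 1, N)       # hypothetical per-user trust in the broadcaster

for _ in range(steps):
    i, j = rng.choice(N, size=2, replace=False)
    # Higher trust widens the confidence bound, making the agent more open
    # to opinions far from its own.
    bound = epsilon * (1 + trust[i])
    if abs(opinions[i] - opinions[j]) < bound:
        oi, oj = opinions[i], opinions[j]
        opinions[i] = oi + mu * (oj - oi)
        opinions[j] = oj + mu * (oi - oj)

# A rough proxy for polarization: the number of surviving opinion clusters.
clusters = np.unique(np.round(opinions, 1))
print(f"{len(clusters)} opinion clusters after {steps} interactions")
```

With a wider effective bound (high trust), opinions tend to collapse into fewer clusters, which matches the abstract's suggestion that trust in the broadcaster counteracts polarization.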
Political polarization appears to be on the rise, as measured by voting behavior, general affect towards opposing partisans and their parties, and content posted and consumed online. Research over the years has focused on the role of the Web as a driver of polarization. In order to further our understanding of the factors behind online polarization, in the present work we collect and analyze Web browsing histories of tens of thousands of users alongside careful measurements of the time spent browsing various news sources. We show that online news consumption follows a polarized pattern, where users' visits to news sources aligned with their own political leaning are substantially longer than their visits to other news sources. Next, we show that such preferences hold at the individual as well as the population level, as evidenced by the emergence of clear partisan communities of news domains from aggregated browsing patterns. Finally, we tackle the important question of the role of user choices in polarization. Are users simply following the links proffered by their Web environment, or do they exacerbate partisan polarization by intentionally pursuing like-minded news sources? To answer this question, we compare browsing patterns with the underlying hyperlink structure spanned by the considered news domains, finding strong evidence of polarization in partisan browsing habits beyond that which can be explained by the hyperlink structure of the Web.
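The comparison posed in the closing question can be illustrated with a small sketch: given a left/right labelling of news domains, ask whether the partisan split separates the co-browsing graph (weighted by shared dwell time) more strongly than the hyperlink graph between the same domains. The toy graphs, the labelling, and the use of modularity as the yardstick are assumptions for illustration, not the paper's actual method.

```python
import networkx as nx

# Partisan split of a handful of hypothetical news domains.
leaning = {"left_a": 0, "left_b": 0, "right_a": 1, "right_b": 1}
partition = [{d for d, s in leaning.items() if s == 0},
             {d for d, s in leaning.items() if s == 1}]

# Co-browsing graph: edge weight = dwell time users spend on both domains (made up).
browsing = nx.Graph()
browsing.add_weighted_edges_from([
    ("left_a", "left_b", 9.0), ("right_a", "right_b", 8.0),
    ("left_a", "right_a", 1.0),
])

# Hyperlink graph: edge = at least one hyperlink between the two domains (made up).
hyperlinks = nx.Graph()
hyperlinks.add_edges_from([
    ("left_a", "left_b"), ("right_a", "right_b"),
    ("left_a", "right_a"), ("left_b", "right_b"),
])

q_browsing = nx.algorithms.community.modularity(browsing, partition, weight="weight")
q_links = nx.algorithms.community.modularity(hyperlinks, partition)
print(f"partisan modularity: browsing={q_browsing:.2f}, hyperlinks={q_links:.2f}")
```

A higher partisan modularity in the browsing graph than in the hyperlink graph would be consistent with the abstract's claim that polarized browsing exceeds what the link structure alone explains.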
Facebook's News Feed personalization algorithm has a significant impact, on a daily basis, on the lifestyle, mood and opinion of millions of Internet users. Nonetheless, the behavior of such algorithms usually lacks transparency, motivating measurements, modeling and analysis in order to understand and improve their properties. In this paper, we propose a reproducible methodology encompassing measurements and an analytical model to capture the visibility of publishers over a News Feed. First, measurements are used to parameterize and to validate the expressive power of the proposed model. Then, we conduct a what-if analysis to assess the visibility bias incurred by the users against a baseline derived from the model. Our results indicate that a significant bias exists and that it is more prominent at the top position of the News Feed. In addition, we found that the bias is non-negligible even for users who are deliberately configured to be neutral with respect to their political views.
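As a rough illustration of such a what-if comparison (not the paper's analytical model), one can contrast each publisher's observed share of top-position impressions with a baseline share; here the baseline is assumed, for simplicity, to be proportional to posting volume, and all names and counts are made up.

```python
import numpy as np

# What-if sketch: observed top-of-feed visibility vs. a proportional baseline.
publishers = ["outlet_A", "outlet_B", "outlet_C"]
posts_published = np.array([50, 30, 20], dtype=float)      # hypothetical posting volume
top_slot_impressions = np.array([80, 15, 5], dtype=float)  # hypothetical top-position views

baseline_share = posts_published / posts_published.sum()
observed_share = top_slot_impressions / top_slot_impressions.sum()

for name, b, o in zip(publishers, baseline_share, observed_share):
    # Positive bias: the publisher is over-represented at the top of the feed.
    print(f"{name}: baseline={b:.2f}, observed={o:.2f}, bias={o - b:+.2f}")
```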
On social media, algorithms for content promotion that account for users' preferences might limit exposure to unsolicited content. In this work, we study how the same content (videos) is consumed on different platforms -- i.e. Facebook and YouTube -- over a sample of $12M$ users. Our findings show that the same content leads to the formation of echo chambers, irrespective of the online social network and thus of the algorithm for content promotion. Finally, we show that users' commenting patterns are accurate early predictors of the formation of echo chambers.
Vaccine hesitancy has been recognized as a major global health threat. Access to any type of information on social media has been suggested as a powerful potential driver of hesitancy. Recent studies in fields other than vaccination show that access to a vast amount of content through the Internet, without intermediaries, has resulted in major segregation of users into polarized groups. Users select the information adhering to their system of beliefs and tend to ignore dissenting information. In this paper we assess whether there is polarization in social media use in the field of vaccination. We perform a thorough quantitative analysis on Facebook, analyzing 2.6M users interacting with 298,018 posts over a time span of seven years and five months. We used community detection algorithms to automatically detect the communities emerging from the users' activity and to quantify their cohesiveness over time. Our findings show that content consumption about vaccines is dominated by the echo-chamber effect and that polarization has increased over the years. Communities emerge from the users' consumption habits, i.e. the majority of users consume information either only in favor of or only against vaccines, not both. The existence of echo chambers may explain why social-media campaigns providing accurate information may have limited reach, may be effective only in sub-groups and might even foment further polarization of opinions. The introduction of dissenting information into a sub-group is disregarded and can have a backfire effect, further reinforcing the existing opinions within the sub-group.
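The abstract does not name the specific community detection algorithm. A minimal sketch of the general procedure, assuming a bipartite user-page interaction graph projected onto pages and clustered with modularity-based detection in networkx, is given below; the edge list and the algorithm choice are illustrative assumptions, not the study's exact pipeline.

```python
import networkx as nx
from networkx.algorithms import bipartite, community

# Hypothetical user-page interactions (e.g. likes on pro- and anti-vaccine pages).
likes = [
    ("user_1", "page_provax_A"), ("user_1", "page_provax_B"),
    ("user_2", "page_novax_A"),  ("user_2", "page_novax_B"),
    ("user_3", "page_provax_A"), ("user_3", "page_provax_B"),
]

B = nx.Graph()
users = {u for u, _ in likes}
pages = {p for _, p in likes}
B.add_nodes_from(users, bipartite=0)
B.add_nodes_from(pages, bipartite=1)
B.add_edges_from(likes)

# Project onto pages: two pages are linked when they share interacting users.
page_graph = bipartite.weighted_projected_graph(B, pages)

# Emergent communities (here, the pro- and anti-vaccine clusters) from the projection.
for c in community.greedy_modularity_communities(page_graph, weight="weight"):
    print(sorted(c))
```

Running the same detection on yearly slices of the interaction data would give the kind of over-time cohesiveness comparison the abstract describes.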
The social brain hypothesis fixes at 150 the number of social relationships we are able to maintain. Similar cognitive constraints emerge in several aspects of our daily life, from our mobility to the way we communicate, and might even affect the way we consume information online. Indeed, despite the unprecedented amount of information we can access online, our attention span remains limited. Furthermore, recent studies showed the tendency of users to ignore dissenting information and instead interact with information adhering to their point of view. In this paper, we quantitatively analyze users' attention economy in news consumption on social media by studying 14M users interacting with 583 news outlets (pages) on Facebook over a time span of 6 years. In particular, we explore how users distribute their activity across news pages and topics. We find that, independently of their activity, users tend to follow a very limited number of pages. On the other hand, users tend to interact with almost all the topics presented by their favored pages. Finally, we introduce a taxonomy of user behavior to distinguish between patterns of selective exposure and patterns of interest. Our findings suggest that the segregation of users into echo chambers might be an emerging effect of users' activity on social media and that selective exposure -- i.e. the tendency of users to consume information coherent with their preferences -- could be a major driver of their consumption patterns.
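One simple way to operationalize the page-versus-topic contrast described above is to compare, per user, how concentrated their activity is over pages with how concentrated it is over topics. The sketch below uses the Gini coefficient on made-up counts; the thresholds and the reading of "selective exposure" are assumptions for illustration, not the taxonomy actually introduced in the paper.

```python
import numpy as np

def gini(counts):
    """Gini coefficient of an activity distribution (0 = evenly spread, higher = concentrated)."""
    x = np.sort(np.asarray(counts, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

# Hypothetical activity of one user: likes per page and per topic.
likes_per_page = [120, 3, 1, 1]      # activity concentrated on one favorite page
likes_per_topic = [30, 28, 35, 32]   # but spread across the topics that page covers

page_gini, topic_gini = gini(likes_per_page), gini(likes_per_topic)
print(f"page concentration={page_gini:.2f}, topic concentration={topic_gini:.2f}")

# Assumed reading: strong concentration on pages with even coverage of topics
# points to selective exposure to sources rather than a narrow topical interest.
if page_gini > 0.5 and topic_gini < 0.5:
    print("pattern consistent with selective exposure to pages")
```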