
Selective Exposure shapes the Facebook News Diet

Added by Antonio Scala PhD
Publication date: 2019
Language: English





The social brain hypothesis fixes at 150 the number of social relationships we are able to maintain. Similar cognitive constraints emerge in several aspects of our daily life, from our mobility up to the way we communicate, and might even affect the way we consume information online. Indeed, despite the unprecedented amount of information we can access online, our attention span remains limited. Furthermore, recent studies showed the tendency of users to ignore dissenting information and to interact with information adhering to their point of view. In this paper, we quantitatively analyze users' attention economy in news consumption on social media by studying 14M users interacting with 583 news outlets (pages) on Facebook over a time span of 6 years. In particular, we explore how users distribute their activity across news pages and topics. We find that, independently of their activity, users tend to follow a very limited number of pages. On the other hand, users tend to interact with almost all the topics presented by their favored pages. Finally, we introduce a taxonomy of user behavior to distinguish between patterns of selective exposure and interest. Our findings suggest that segregation of users in echo chambers might be an emerging effect of users' activity on social media, and that selective exposure -- i.e., the tendency of users to consume information coherent with their preferences -- could be a major driver of their consumption patterns.
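The core measurement described above (how users distribute their activity across pages) can be sketched as counting, per user, the total number of interactions and the number of distinct pages engaged with. This is an illustrative sketch only: the toy interaction log and field names (user_id, page_id) are assumptions, not the paper's actual data schema.

```python
# Illustrative sketch: distinct pages followed vs. total activity per user.
# The interaction log below is a toy example, not the paper's dataset.
from collections import defaultdict

# toy interaction log: (user_id, page_id) pairs
interactions = [
    ("u1", "p1"), ("u1", "p1"), ("u1", "p2"),
    ("u2", "p1"), ("u2", "p2"), ("u2", "p3"), ("u2", "p3"),
]

pages_per_user = defaultdict(set)   # distinct pages each user touched
activity = defaultdict(int)         # total interactions per user

for user, page in interactions:
    pages_per_user[user].add(page)
    activity[user] += 1

# comparing activity against len(pages_per_user[user]) across many users
# would reveal the pattern the paper reports: activity grows while the
# number of followed pages stays small
for user in sorted(activity):
    print(user, activity[user], len(pages_per_user[user]))
```

In the paper's setting, the interesting signal is that `len(pages_per_user[user])` saturates even as `activity[user]` grows.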




Related research

On social media, algorithms for content promotion that account for users' preferences might limit exposure to unsolicited content. In this work, we study how the same content (videos) is consumed on different platforms -- i.e., Facebook and YouTube -- over a sample of 12M users. Our findings show that the same content leads to the formation of echo chambers, irrespective of the online social network and thus of the algorithm for content promotion. Finally, we show that users' commenting patterns are accurate early predictors of the formation of echo chambers.
Recent studies targeting Facebook showed the tendency of users to interact with information adhering to their preferred narrative and to ignore dissenting information. Primarily driven by confirmation bias, users tend to join polarized clusters where they cooperate to reinforce a like-minded system of beliefs, thus facilitating fake news and misinformation cascades. To gain a deeper understanding of these phenomena, in this work we analyze the lexicons used by the communities of users emerging on Facebook around verified and unverified content. We show how the lexical approach provides important insights into the kind of information processed by the two communities of users and into their overall sentiment. Furthermore, by focusing on comment threads, we observe a strong positive correlation between the lexical convergence of co-commenters and their number of interactions, which in turn suggests that such a trend could be a proxy for the emergence of collective identities and polarization in opinion dynamics.
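One simple way to operationalize "lexical convergence" between two co-commenters is the Jaccard similarity of their vocabularies. This is a hedged sketch under that assumption; the paper does not specify this exact measure, and the comment texts below are invented for illustration.

```python
# Sketch (not the paper's code): lexical convergence of two commenters
# measured as Jaccard similarity between their word sets.

def vocabulary(comments):
    """Lower-cased set of whitespace-separated tokens across comments."""
    return {w.lower() for c in comments for w in c.split()}

def lexical_convergence(comments_a, comments_b):
    """Jaccard similarity |A & B| / |A | B| of the two vocabularies."""
    va, vb = vocabulary(comments_a), vocabulary(comments_b)
    union = va | vb
    return len(va & vb) / len(union) if union else 0.0

# toy comment histories for two co-commenters
a = ["the news is fake", "totally fake news"]
b = ["fake news again", "this is fake"]
print(round(lexical_convergence(a, b), 2))
```

Correlating this score with the number of interactions between the same pair of users would reproduce, in miniature, the trend the abstract describes.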
The advent of the WWW changed the way we produce and access information. Recent studies showed that users tend to select information that is consistent with their system of beliefs, forming polarized groups of like-minded people around shared narratives where dissenting information is ignored. In this environment, users cooperate to frame and reinforce their shared narrative, making any attempt at debunking ineffective. Such a configuration occurs even in the consumption of news online, and considering that 63% of users access news directly from social media, one hypothesis is that more polarization allows for further spreading of misinformation. Along this path, we focus on the polarization of users around news outlets on Facebook in different European countries (Italy, France, Spain, and Germany). First, we compare the pages' posting behavior and the users' interaction patterns across countries and observe different posting, liking, and commenting rates. Second, we explore the tendency of users to interact with different pages (i.e., selective exposure) and the emergence of polarized communities generated around specific pages. Then, we introduce a new metric -- the polarization rank -- to measure the polarization of communities in each country. We find that Italy is the most polarized country, followed by France, Germany, and lastly Spain. Finally, we present a variation of the Bounded Confidence Model to simulate the emergence of these communities by considering users' engagement and trust in the news. Our findings suggest that trust in the information broadcaster plays a pivotal role against the polarization of users online.
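The Bounded Confidence Model mentioned above is, in its classic Deffuant-Weisbuch form, a pairwise opinion update applied only when two agents' opinions are within a confidence threshold. A minimal sketch of the standard model follows; the paper's variation additionally incorporates engagement and trust terms, which are omitted here, and all parameter values are illustrative.

```python
# Minimal Deffuant-style bounded confidence sketch. Agents hold opinions in
# [0, 1]; a random pair interacts only if their opinions differ by less than
# eps, in which case both move toward each other by a fraction mu.
import random

def bounded_confidence(opinions, eps=0.2, mu=0.5, steps=10_000, seed=42):
    rng = random.Random(seed)
    x = list(opinions)
    n = len(x)
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j and abs(x[i] - x[j]) < eps:
            shift = mu * (x[j] - x[i])  # symmetric update conserves the mean
            x[i] += shift
            x[j] -= shift
    return x

init_rng = random.Random(0)
initial = [init_rng.random() for _ in range(100)]
final = bounded_confidence(initial)
```

With a small `eps`, opinions settle into several disjoint clusters, mirroring the polarized communities the abstract describes; larger `eps` yields consensus.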
Facebook's News Feed personalization algorithm has a significant daily impact on the lifestyle, mood, and opinions of millions of Internet users. Nonetheless, the behavior of such algorithms usually lacks transparency, motivating measurement, modeling, and analysis in order to understand and improve their properties. In this paper, we propose a reproducible methodology encompassing measurements and an analytical model to capture the visibility of publishers over a News Feed. First, measurements are used to parameterize and validate the expressive power of the proposed model. Then, we conduct a what-if analysis to assess the visibility bias incurred by users against a baseline derived from the model. Our results indicate that a significant bias exists and that it is more prominent at the top position of the News Feed. In addition, we find that the bias is non-negligible even for users that are deliberately set as neutral with respect to their political views.
Privacy in Online Social Networks (OSNs) has evolved from a niche topic into a broadly discussed issue across a wide variety of media. Nevertheless, OSNs drastically increase the amount of information that can be found about individuals on the web. To estimate the dimension of data leakage in OSNs, we measure the real exposure of user content for 4,182 users from 102 countries in the most popular OSN, Facebook. We further quantify the impact of a comprehensible privacy control interface that has been shown to substantially decrease configuration effort as well as misconfiguration in audience selection. Our study highlights the importance of usable security: (i) the total amount of content visible to Facebook users does not dramatically decrease when the audience selection interface is simplified, but the composition of the visible content changes; (ii) which information is uploaded to Facebook, as well as which information is shared with whom, strongly depends on the user's country of origin.