
Auditing the Biases Enacted by YouTube for Political Topics in Germany

Posted by Hendrik Heuer
Publication date: 2021
Research field: Informatics Engineering
Paper language: English





With YouTube's growing importance as a news platform, its recommendation system came under increased scrutiny. Recognizing YouTube's recommendation system as a broadcaster of media, we explore the applicability of laws that require broadcasters to give important political, ideological, and social groups adequate opportunity to express themselves in the broadcasted program of the service. We present audits as an important tool to enforce such laws and to ensure that a system operates in the public's interest. To examine whether YouTube is enacting certain biases, we collected video recommendations about political topics by following chains of ten recommendations per video. Our findings suggest that YouTube's recommendation system is enacting important biases. We find that YouTube is recommending increasingly popular but topically unrelated videos. The sadness evoked by the recommended videos decreases while the happiness increases. We discuss the strong popularity bias we identified and analyze the link between the popularity of content and emotions. We also discuss how audits empower researchers and civic hackers to monitor complex machine learning (ML)-based systems like YouTube's recommendation system.
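The chain-following procedure described in the abstract can be sketched as a simple crawl: start from a seed video about a political topic, repeatedly request the top recommendation, and record ten hops. Below is a minimal sketch in Python, assuming a hypothetical fetch_top_recommendation helper that stands in for whatever scraping or API mechanism returns the first recommended video; the paper's actual data-collection tooling is not shown here.

```python
from typing import Callable, List


def collect_recommendation_chain(
    seed_video_id: str,
    fetch_top_recommendation: Callable[[str], str],
    chain_length: int = 10,
) -> List[str]:
    """Follow a chain of recommendations starting from a seed video.

    fetch_top_recommendation is a hypothetical placeholder for the
    mechanism (scraper or API client) that returns the ID of the first
    recommended video for a given video ID.
    """
    chain = [seed_video_id]
    current = seed_video_id
    for _ in range(chain_length):
        current = fetch_top_recommendation(current)
        chain.append(current)
    return chain


# Example usage with a stub recommender, purely for illustration.
if __name__ == "__main__":
    print(collect_recommendation_chain("seed123", lambda vid: f"rec-of-{vid}"))
```

The recorded chains can then be joined with video metadata such as view counts, topics, and emotion annotations to test how popularity and evoked emotions change along the recommendation path.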




Read also

People eager to learn about a topic can access Wikipedia to form a preliminary opinion. Despite the solid revision process behind the encyclopedia's articles, the users' exploration process is still influenced by the hyperlink network. In this paper, we shed light on this overlooked phenomenon by investigating how articles describing complementary subjects of a topic interconnect, and thus may shape readers' exposure to diverging content. To quantify this, we introduce the exposure to diverse information, a metric that captures how users' exposure to multiple subjects of a topic varies click after click by leveraging navigation models. For the experiments, we collected six topic-induced networks about polarizing topics and analyzed the extent to which their topologies induce readers to examine diverse content. More specifically, we take two sets of articles about opposing stances (e.g., gun control and gun rights) and measure the probability that users move within or across the sets, by simulating their behavior via a Wikipedia-tailored model. Our findings show that the networks hinder users from symmetrically exploring diverse content. Moreover, on average, the probability that the networks nudge users to remain in a knowledge bubble is up to an order of magnitude higher than the probability of exploring pages of contrasting subjects. Taken together, these findings paint a new and intriguing picture of the structural influence of Wikipedia's network on the exploration of polarizing issues.
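One way to read the within-set versus cross-set probability mentioned above is as the outcome of simulated navigation over the topic-induced network: a reader starts on an article of one stance, clicks a number of links, and either stays inside that stance set or crosses over. The sketch below illustrates the idea with a toy graph and a uniform-click walker; the article names, links, and click model are illustrative assumptions, not the paper's Wikipedia-tailored navigation model.

```python
import random
from typing import Dict, List, Set

# Toy topic-induced network: article -> outgoing hyperlinks (illustrative only).
links: Dict[str, List[str]] = {
    "Gun_control": ["Second_Amendment", "Firearm_regulation"],
    "Firearm_regulation": ["Gun_control", "Gun_rights"],
    "Second_Amendment": ["Gun_rights", "Gun_control"],
    "Gun_rights": ["Second_Amendment", "Gun_control"],
}
# One of the two opposing stance sets; the other set is its complement here.
stance_control: Set[str] = {"Gun_control", "Firearm_regulation"}


def cross_set_probability(start: str, clicks: int, walks: int = 10_000) -> float:
    """Estimate the probability that a reader who starts in the
    gun-control set is on a gun-rights page after `clicks` uniform clicks."""
    crossed = 0
    for _ in range(walks):
        page = start
        for _ in range(clicks):
            page = random.choice(links[page])
        crossed += page not in stance_control
    return crossed / walks


print(cross_set_probability("Gun_control", clicks=3))
```

Comparing this estimate for walks started in each stance set exposes the asymmetry the abstract refers to: if walks started on one side cross over much less often than walks started on the other, the network nudges readers toward a knowledge bubble.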
User beliefs about algorithmic systems are constantly co-produced through user interaction and the complex socio-technical systems that generate recommendations. Identifying these beliefs is crucial because they influence how users interact with recommendation algorithms. With no prior work on user beliefs about algorithmic video recommendations, practitioners lack relevant knowledge to improve the user experience of such systems. To address this problem, we conducted semi-structured interviews with middle-aged YouTube video consumers to analyze their user beliefs about the video recommendation system. Our analysis revealed different factors that users believe influence their recommendations. Based on these factors, we identified four groups of user beliefs: Previous Actions, Social Media, Recommender System, and Company Policy. Additionally, we propose a framework to distinguish the four main actors that users believe influence their video recommendations: the current user, other users, the algorithm, and the organization. This framework provides a new lens to explore design suggestions based on the agency of these four actors. It also exposes a previously unexplored aspect: the effect of corporate decisions on the interaction with algorithmic recommendations. While we found that users are aware of the existence of the recommendation system on YouTube, we show that their understanding of this system is limited.
Inappropriate and profane content on social media is increasing exponentially, and big corporations are becoming more aware of the type of content on which they are advertising and how it may affect their brand reputation. But with a huge surge in content being posted online, it becomes increasingly difficult to filter out related videos on which they can run their ads without compromising the brand name. Advertising on YouTube videos generates a huge amount of revenue for corporations. It becomes increasingly important for such corporations to advertise only on videos that don't hurt the feelings, community, or harmony of the audience at large. In this paper, we propose a system to identify inappropriate content on YouTube and leverage it to perform a first-of-its-kind, large-scale, quantitative characterization that reveals some of the risks of YouTube ad consumption on inappropriate videos. Customizations of the architecture have also been included to serve different requirements of corporations. Our analysis reveals that YouTube is still plagued by such disturbing videos and that its currently deployed countermeasures are ineffective at detecting them in a timely manner. Our framework tries to fill this gap by providing a handy, add-on solution to filter the videos and help corporations and companies push ads on the platform without worrying about the content on which the ads are displayed.
Political polarization appears to be on the rise, as measured by voting behavior, general affect towards opposing partisans and their parties, and content posted and consumed online. Research over the years has focused on the role of the Web as a driver of polarization. In order to further our understanding of the factors behind online polarization, in the present work we collect and analyze Web browsing histories of tens of thousands of users alongside careful measurements of the time spent browsing various news sources. We show that online news consumption follows a polarized pattern, where users' visits to news sources aligned with their own political leaning are substantially longer than their visits to other news sources. Next, we show that such preferences hold at the individual as well as the population level, as evidenced by the emergence of clear partisan communities of news domains from aggregated browsing patterns. Finally, we tackle the important question of the role of user choices in polarization. Are users simply following the links proffered by their Web environment, or do they exacerbate partisan polarization by intentionally pursuing like-minded news sources? To answer this question, we compare browsing patterns with the underlying hyperlink structure spanned by the considered news domains, finding strong evidence of polarization in partisan browsing habits beyond what can be explained by the hyperlink structure of the Web.
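The dwell-time comparison in the abstract can be illustrated with a small aggregation: for each visit, check whether the news domain's leaning matches the user's own leaning, then average visit durations separately for aligned and cross-cutting sources. The data below is invented purely for illustration; the paper's actual leanings, users, and durations are not reproduced here.

```python
from collections import defaultdict
from statistics import mean

# Each record: (user leaning, news domain leaning, seconds spent on the visit).
# These values are made up for illustration only.
visits = [
    ("left", "left", 240),
    ("left", "right", 35),
    ("right", "right", 310),
    ("right", "left", 20),
]

durations = defaultdict(list)
for user_leaning, domain_leaning, seconds in visits:
    key = "aligned" if user_leaning == domain_leaning else "cross-cutting"
    durations[key].append(seconds)

for key, values in sorted(durations.items()):
    print(f"{key}: mean visit duration {mean(values):.0f}s")
```

A polarized pattern in the sense of the abstract shows up as a substantially higher mean duration for aligned visits than for cross-cutting ones.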
In the attention economy, video apps employ design mechanisms like autoplay that exploit psychological vulnerabilities to maximize watch time. Consequently, many people feel a lack of agency over their app use, which is linked to negative life effects such as loss of sleep. Prior design research has innovated external mechanisms that police multiple apps, such as lockout timers. In this work, we shift the focus to how the internal mechanisms of an app can support user agency, taking the popular YouTube mobile app as a test case. From a survey of 120 U.S. users, we find that autoplay and recommendations primarily undermine users' sense of agency, while search and playlists support it. From 13 co-design sessions, we find that when users have a specific intention for how they want to use YouTube, they prefer interfaces that support greater agency. We discuss implications for how designers can help users reclaim a sense of agency over their media use.