
Automated Experiments on Ad Privacy Settings: A Tale of Opacity, Choice, and Discrimination

Published by Michael Tschantz
Publication date: 2014
Research field: Informatics engineering
Paper language: English





To partly address people's concerns over web tracking, Google has created the Ad Settings webpage to provide information about, and some choice over, the profiles Google creates on users. We present AdFisher, an automated tool that explores how user behaviors, Google's ads, and Ad Settings interact. AdFisher can run browser-based experiments and analyze data using machine learning and significance tests. Our tool uses a rigorous experimental design and statistical analysis to ensure the statistical soundness of our results. We use AdFisher to find that Ad Settings was opaque about some features of a user's profile, that it does provide some choice on ads, and that these choices can lead to seemingly discriminatory ads. In particular, we found that visiting webpages associated with substance abuse changed the ads shown but not the settings page. We also found that setting the gender to female resulted in getting fewer instances of an ad related to high-paying jobs than setting it to male. We cannot determine who caused these findings due to our limited visibility into the ad ecosystem, which includes Google, advertisers, websites, and users. Nevertheless, these results can form the starting point for deeper investigations by either the companies themselves or by regulatory bodies.
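The statistical machinery behind this kind of experiment is a two-sample significance test on ad observations from a treatment and a control group of browser instances. A minimal sketch of a permutation test on a simple statistic (difference of mean ad counts) is below; the function name and data are illustrative, not taken from the AdFisher codebase, which uses richer classifier-based statistics:

```python
import random

def permutation_test(group_a, group_b, trials=10000, seed=0):
    """Two-sample permutation test on the absolute difference of means.

    Repeatedly shuffles the pooled observations into two pseudo-groups
    and returns the fraction of shuffles whose statistic is at least as
    extreme as the observed one (an empirical p-value).
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        a, b = pooled[:n_a], pooled[n_a:]
        diff = abs(sum(a) / n_a - sum(b) / len(b))
        if diff >= observed:
            extreme += 1
    return extreme / trials
```

A low p-value (e.g. below 0.05 after any multiple-testing correction) indicates the treatment, such as a changed gender setting, plausibly affected the ads served.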


Read also

Google's Ad Settings shows the gender and age that Google has inferred about a web user. We compare the inferred values to the self-reported values of 501 survey participants. We find that Google often does not show an inference, but when it does, it is typically correct. We explore which usage characteristics, such as using privacy enhancing technologies, are associated with Google's accuracy, but found no significant results.
Recent work has demonstrated that by monitoring the Real Time Bidding (RTB) protocol, one can estimate the monetary worth of different users for the programmatic advertising ecosystem, even when the so-called winning bids are encrypted. In this paper we describe how to implement the above techniques in a practical and privacy-preserving manner. Specifically, we study the privacy consequences of reporting back to a centralized server the features that are necessary for estimating the value of encrypted winning bids. We show that by appropriately modulating the granularity of the necessary information and by scrambling the communication channel to the server, one can increase the privacy performance of the system in terms of K-anonymity. We've implemented the above ideas in a browser extension and disseminated it to some 200 users. Analyzing the results from 6 months of deployment, we show that the average value of users for the programmatic advertising ecosystem has grown more than 75% in the last 3 years.
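The granularity-modulation idea can be made concrete with a small sketch: coarsening reported values into wider buckets makes more users indistinguishable from one another, raising the K in K-anonymity. The helper names below are illustrative assumptions, not from the paper's extension:

```python
from collections import Counter

def coarsen(values, bucket_width):
    """Round each reported value down to its bucket boundary,
    reducing the granularity of what the server learns."""
    return [bucket_width * (v // bucket_width) for v in values]

def k_anonymity(values):
    """K-anonymity of a report set: the size of the smallest group
    of users sharing an identical reported value."""
    return min(Counter(values).values())
```

With unique raw values every user is fully distinguishable (K = 1); widening the buckets trades reporting precision for a larger anonymity set.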
Online advertising fuels the (seemingly) free internet. However, although users can access most web services free of charge, they pay a heavy cost: their privacy. They are forced to trust third parties and intermediaries, who not only collect behavioral data but also absorb great amounts of ad revenues. Consequently, more and more users opt out from advertising by resorting to ad blockers, thus costing publishers millions of dollars in lost ad revenues. Albeit there are various privacy-preserving advertising proposals (e.g., Adnostic, Privad, Brave Ads) from both academia and industry, they all rely on centralized management that users have to blindly trust without being able to audit, while they also fail to guarantee the integrity of the performance analytics they provide to advertisers. In this paper, we design and deploy THEMIS, a novel, decentralized and privacy-by-design ad platform that requires zero trust by users. THEMIS (i) provides auditability to its participants, (ii) rewards users for viewing ads, and (iii) allows advertisers to verify the performance and billing reports of their ad campaigns. By leveraging smart contracts and zero-knowledge schemes, we implement a prototype of THEMIS; early performance evaluation results show that it scales linearly on a multi-sidechain setup while supporting more than 51M users on a single sidechain.
Differential privacy has become a de facto standard for releasing data in a privacy-preserving way. Creating a differentially private algorithm is a process that often starts with a noise-free (non-private) algorithm. The designer then decides where to add noise, and how much of it to add. This can be a non-trivial process -- if not done carefully, the algorithm might either violate differential privacy or have low utility. In this paper, we present DPGen, a program synthesizer that takes in non-private code (without any noise) and automatically synthesizes its differentially private version (with carefully calibrated noise). Under the hood, DPGen uses novel algorithms to automatically generate a sketch program with candidate locations for noise, and then optimize privacy proof and noise scales simultaneously on the sketch program. Moreover, DPGen can synthesize sophisticated mechanisms that adaptively process queries until a specified privacy budget is exhausted. When evaluated on standard benchmarks, DPGen is able to generate differentially private mechanisms that optimize simple utility functions within 120 seconds. It is also powerful enough to synthesize adaptive privacy mechanisms.
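The "carefully calibrated noise" that DPGen synthesizes is typically an instance of the standard Laplace mechanism, where the noise scale is the query's sensitivity divided by the privacy budget epsilon. A minimal, self-contained sketch of that primitive is below; it illustrates the calibration DPGen automates, not DPGen's actual output:

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, seed=None):
    """Release true_value with Laplace(0, sensitivity / epsilon) noise,
    the standard calibration for epsilon-differential privacy."""
    rng = random.Random(seed)
    scale = sensitivity / epsilon
    # Inverse-transform sampling: map Uniform(-0.5, 0.5) to Laplace(0, scale).
    u = rng.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_value + noise
```

Smaller epsilon means a larger noise scale and stronger privacy; getting this trade-off wrong by hand is exactly the failure mode (privacy violation or low utility) that synthesis tools like DPGen aim to eliminate.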
Qiang Tang, 2020
In the current COVID-19 pandemic, manual contact tracing has proven very helpful for reaching close contacts of infected users and slowing down virus spread. To improve its scalability, a number of automated contact tracing (ACT) solutions have been proposed and some of them have been deployed. Despite the dedicated efforts, security and privacy issues of these solutions are still open and under intensive debate. In this paper, we examine the ACT concept from a broader perspective, focusing not only on security and privacy issues but also on functional issues such as interface, usability, and coverage. We first elaborate on these issues and particularly point out the inevitable privacy leakages in existing BLE-based ACT solutions. Then, we propose a venue-based ACT concept, which only monitors users' contact histories in virus-spreading-prone venues and is able to incorporate different location tracking technologies such as BLE and WiFi. Finally, we instantiate the venue-based ACT concept and show that our instantiation can mitigate most of the issues we have identified in our analysis.