
A Taxonomy of Privacy Constructs for Privacy-Sensitive Robotics

Added by Matthew Rueben
Publication date: 2017
Language: English





The introduction of robots into our society will also introduce new concerns about personal privacy. In order to study these concerns, we must do human-subject experiments that involve measuring privacy-relevant constructs. This paper presents a taxonomy of privacy constructs based on a review of the privacy literature. Future work in operationalizing privacy constructs for HRI studies is also discussed.
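Since the paper's contribution is the taxonomy itself, one hedged way to picture it is as a small data model that an HRI study could instrument against: each questionnaire item is tagged with the construct it operationalizes. The category and construct names below (informational, physical, psychological, social; "perceived surveillance") are illustrative assumptions drawn from common divisions in the privacy literature, not a verbatim copy of the paper's taxonomy.

```typescript
// Hypothetical sketch: a typed hierarchy of privacy constructs so that survey
// items in an HRI study can be tagged with the construct they measure.
// Category and construct names are illustrative, not the paper's exact taxonomy.

type PrivacyCategory = "informational" | "physical" | "psychological" | "social";

interface PrivacyConstruct {
  name: string;                 // e.g. "perceived surveillance"
  category: PrivacyCategory;    // top-level branch of the taxonomy
  definition: string;           // working definition used by the study
}

interface SurveyItem {
  text: string;                 // question shown to participants
  construct: PrivacyConstruct;  // which construct this item operationalizes
  scale: "likert-5" | "likert-7";
}

// Example: one item operationalizing an informational-privacy construct.
const perceivedSurveillance: PrivacyConstruct = {
  name: "perceived surveillance",
  category: "informational",
  definition: "The feeling that the robot is recording or monitoring me.",
};

const item: SurveyItem = {
  text: "I felt that the robot was recording what I did.",
  construct: perceivedSurveillance,
  scale: "likert-7",
};

console.log(`${item.construct.category} / ${item.construct.name}: ${item.text}`);
```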



Related research

News recommendation and personalization is not a solved problem. People are growing concerned about their data being collected in excess in the name of personalization, and about its use for purposes other than those they would consider reasonable. Our experience in building personalization products for publishers while safeguarding user privacy led us to investigate the user perspective on privacy and personalization further. We conducted a survey to explore people's experience with personalization and privacy and the viewpoints of different age groups. In this paper, we share our major findings with publishers and the community to inform the algorithmic design and implementation of the next generation of news recommender systems, which must put the human at their core and strike a balance between personalization and privacy to reap the benefits of both.
The W3C's Geolocation API may rapidly standardize the transmission of location information on the Web, but, in dealing with such sensitive information, it also raises serious privacy concerns. We analyze the manner and extent to which the current W3C Geolocation API provides mechanisms to support privacy. We propose a privacy framework for the consideration of location information and use it to evaluate the W3C Geolocation API, both the specification and its use in the wild, and recommend some modifications to the API as a result of our analysis.
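For readers unfamiliar with the API under analysis, the sketch below shows the standard W3C Geolocation calls a page can make and the points at which privacy decisions surface: the user-agent permission prompt and the accuracy the site requests. Only standard browser APIs are used; the coarsening step at the end is an illustrative assumption, not something the specification itself provides.

```typescript
// Browser-side sketch of the W3C Geolocation API and where privacy enters.
// The permission prompt is enforced by the user agent; the accuracy the page
// requests (and any rounding it applies afterwards) is up to the site.

async function requestCoarseLocation(): Promise<void> {
  // Check the current permission state before prompting (Permissions API).
  const status = await navigator.permissions.query({ name: "geolocation" });
  console.log(`geolocation permission: ${status.state}`); // "granted" | "prompt" | "denied"

  navigator.geolocation.getCurrentPosition(
    (pos) => {
      // Illustrative coarsening: round to roughly city-block precision
      // before using or transmitting the coordinates.
      const lat = Math.round(pos.coords.latitude * 100) / 100;
      const lon = Math.round(pos.coords.longitude * 100) / 100;
      console.log(`coarse position: ${lat}, ${lon}`);
    },
    (err) => console.warn(`location denied or unavailable: ${err.message}`),
    { enableHighAccuracy: false, maximumAge: 600_000, timeout: 10_000 }
  );
}
```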
Human brain activity collected in the form of electroencephalography (EEG), even with a low number of sensors, is an extremely rich signal. Traces collected from multiple channels at high sampling rates capture many important aspects of participants' brain activity and can be used as a unique personal identifier. The motivation for sharing EEG signals is significant, as a means to understand the relation between brain activity and well-being, or for communication with medical services. As the equipment for such data collection becomes more available and widely used, the opportunities for using the data are growing; at the same time, however, the inherent privacy risks are mounting. The same raw EEG signal can be used, for example, to diagnose mental diseases, find traces of epilepsy, and decode personality traits. The current practice of informed consent either prevents reuse of the raw signal or does not truly respect participants' right to privacy, because the same raw data are reused for purposes quite different from those originally consented to. Here we propose an integration of a personal neuroinformatics system, the Smartphone Brain Scanner, with a general privacy framework, openPDS. We show how raw high-dimensional data can be collected on a mobile device, uploaded to a server, and subsequently operated on and accessed by applications or researchers without disclosing the raw signal. The extracted features of the raw signal, called answers, are of significantly lower dimensionality and provide the full utility of the data in a given context without the risk of disclosing the sensitive raw signal. Such an architecture significantly mitigates a serious privacy risk: raw EEG recordings floating around and being used and reused for various purposes.
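The key architectural idea (keep the raw signal inside the personal data store and release only low-dimensional "answers") can be sketched roughly as below. The per-channel mean-power summary is a deliberate simplification standing in for real feature extraction; it is an illustrative assumption and does not reproduce the actual Smartphone Brain Scanner or openPDS pipeline.

```typescript
// Sketch of the "answers, not raw data" idea: an application never sees the
// raw multi-channel EEG trace, only a short per-channel summary computed
// inside the data store. Proper band-power estimation via an FFT is replaced
// by a crude mean-square amplitude per channel, purely for illustration.

type RawEEG = number[][]; // [channel][sample], stays inside the personal data store

interface Answer {
  channel: number;
  meanPower: number; // low-dimensional feature released to the application
}

function computeAnswers(raw: RawEEG): Answer[] {
  return raw.map((samples, channel) => {
    const meanPower =
      samples.reduce((acc, x) => acc + x * x, 0) / Math.max(samples.length, 1);
    return { channel, meanPower };
  });
}

// The application receives only the answers, never `raw`.
const raw: RawEEG = [
  [0.1, -0.2, 0.15, 0.05],
  [0.3, 0.25, -0.1, 0.0],
];
console.log(computeAnswers(raw)); // e.g. [{ channel: 0, meanPower: ... }, ...]
```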
Privacy dashboards and transparency tools help users review and manage the data collected about them online. Since 2016, Google has offered such a tool, My Activity, which allows users to review and delete their activity data from Google services. We conducted an online survey with $n = 153$ participants to understand whether Google's My Activity, as an example of a privacy transparency tool, increases or decreases end users' concerns and perceived benefits regarding data collection. While most participants were aware of Google's data collection, its volume and detail were surprising; after exposure to My Activity, participants were significantly more likely both to be less concerned about data collection and to view it as more beneficial. Only $25\%$ indicated that they would change any settings in the My Activity service or change any behaviors. This suggests that privacy transparency tools are quite beneficial for online services, as they garner trust with users and improve their perceptions without necessarily changing users' behaviors. At the same time, it remains unclear whether such transparency tools actually improve end-user privacy by sufficiently assisting or motivating users to change or review data-collection settings.
The question we raise through this paper is: is it economically feasible to trade consumers' personal information with their formal consent (permission) and, in return, provide them incentives (monetary or otherwise)? In view of (a) the behavioral assumption that humans are "compromising" beings and have privacy preferences, (b) privacy being a good without strict boundaries, and (c) the practical inevitability of inappropriate data leakage by data holders downstream in the data-release supply chain, we propose a design of regulated efficient/bounded-inefficient economic mechanisms for oligopoly data-trading markets using a novel preference-function bidding approach on a simplified sellers-broker market. Our methodology preserves heterogeneous privacy-preservation constraints (at a grouped-consumer, i.e., app, level) up to certain compromise levels and, at the same time, satisfies the information demand (via the broker) of agencies (e.g., advertising organizations) that collect client data for the purpose of targeted behavioral advertising.
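As a rough intuition for the sellers-broker structure, the toy sketch below has each seller (e.g., an app aggregating its users) quote a per-unit price and a cap reflecting its privacy-compromise tolerance, while the broker fills an information demand at minimum quoted cost without exceeding any seller's cap. The names and the greedy allocation rule are illustrative assumptions only, not the regulated mechanism the paper designs.

```typescript
// Toy sellers-broker sketch: each seller quotes a price per unit of released
// information and a cap implied by its privacy-compromise level. The broker
// greedily fills the demanded quantity at minimum quoted cost. This shows the
// market structure only, not the paper's preference-function bidding mechanism.

interface Seller {
  id: string;
  pricePerUnit: number;  // quoted price for one unit of information
  maxUnits: number;      // cap implied by the seller's compromise level
}

function allocate(sellers: Seller[], demand: number): Map<string, number> {
  const allocation = new Map<string, number>();
  let remaining = demand;
  for (const s of [...sellers].sort((a, b) => a.pricePerUnit - b.pricePerUnit)) {
    if (remaining <= 0) break;
    const units = Math.min(s.maxUnits, remaining);
    allocation.set(s.id, units);
    remaining -= units;
  }
  return allocation; // unmet demand simply stays unallocated in this sketch
}

console.log(allocate(
  [
    { id: "app-A", pricePerUnit: 2.0, maxUnits: 10 },
    { id: "app-B", pricePerUnit: 1.5, maxUnits: 4 },
  ],
  8
)); // Map { "app-B" => 4, "app-A" => 4 }
```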