
Towards Understanding Connections between Security/Privacy Attitudes and Unlock Authentication

 Added by Adam Aviv
 Publication date 2018
Research language: English





In this study, we examine the ways in which user attitudes towards privacy and security relating to mobile devices, and the data stored thereon, may impact the strength of unlock authentication, focusing on Android's graphical unlock patterns. We conducted an online study with Amazon Mechanical Turk ($N=750$) using self-reported unlock authentication choices, as well as Likert scale agreement/disagreement responses to a set of seven privacy/security prompts. We then analyzed the responses along multiple dimensions, including a straight average of the Likert responses as well as Principal Component Analysis to expose latent factors. We found that responses to two of the seven questions proved relevant and significant: those concerning general concern for data stored on mobile devices, and those concerning unauthorized access by known actors. Unfortunately, larger conclusions cannot be drawn about the efficacy of the broader set of questions for exposing connections between privacy/security attitudes and unlock authentication strength (Pearson rank $r=-0.08$, $p<0.1$). However, both of our factor solutions exposed differences in responses across demographic groups, including age, gender, and residence type. The findings of this study suggest that there is likely a link between perceptions of privacy/security on mobile devices and the perceived threats therein, but more research is needed, particularly on developing better survey and measurement techniques for privacy/security attitudes that relate specifically to mobile devices.
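The analysis pipeline the abstract describes, a straight average of Likert responses plus Principal Component Analysis to expose latent factors, can be sketched as follows. The response matrix below is illustrative toy data, not the study's dataset; only the shape (participants by seven Likert items) matches the described design.

```python
import numpy as np

# Toy matrix: 6 participants x 7 Likert items
# (1 = strongly disagree ... 5 = strongly agree); values are invented.
likert = np.array([
    [5, 4, 4, 3, 5, 2, 4],
    [2, 2, 3, 1, 2, 3, 2],
    [4, 5, 4, 4, 4, 3, 5],
    [1, 2, 1, 2, 1, 2, 1],
    [3, 3, 4, 3, 3, 4, 3],
    [5, 5, 5, 4, 5, 4, 5],
], dtype=float)

# Straight average of each participant's Likert responses
avg_scores = likert.mean(axis=1)

# PCA via eigendecomposition of the item covariance matrix
centered = likert - likert.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending order
order = np.argsort(eigvals)[::-1]            # reorder: descending variance
explained = eigvals[order] / eigvals.sum()   # variance ratio per component
scores = centered @ eigvecs[:, order]        # participant factor scores
```

The leading components of `scores` play the role of the latent "factor solutions" the abstract refers to; in practice one would retain only components with meaningful explained variance before correlating them with authentication strength.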




Read More

Mixed reality (MR) technology development is now gaining momentum due to advances in computer vision, sensor fusion, and realistic display technologies. With most research and development focused on delivering the promise of MR, only a few efforts address the privacy and security implications of this technology. This survey paper aims to bring these risks to light and to review the latest security and privacy work on MR. Specifically, we list and review the different protection approaches that have been proposed to ensure user and data security and privacy in MR. We extend the scope to include work on related technologies such as augmented reality (AR), virtual reality (VR), and human-computer interaction (HCI) as crucial components, if not the origins, of MR, as well as numerous related works from the larger area of mobile devices, wearables, and the Internet of Things (IoT). We highlight the lack of investigation, implementation, and evaluation of data protection approaches in MR. Further challenges and directions in MR security and privacy are also discussed.
Gamification and serious games are progressively being used across a host of fields, particularly to support education. Such games provide a new way to engage students with content and can complement more traditional approaches to learning. This article proposes SherLOCKED, a new serious game created in the style of a 2D top-down puzzle adventure. The game is situated in the context of an undergraduate cyber security course and is used to consolidate students' knowledge of foundational security concepts (e.g., the CIA triad, security threats and attacks, and risk management). SherLOCKED was built based on a review of existing serious games and a study of common gamification principles. It was subsequently implemented within an undergraduate course and evaluated with 112 students. We found the game to be an effective, attractive, and fun solution for allowing further engagement with content that students were introduced to during lectures. This research lends additional evidence to the use of serious games in supporting learning about cyber security.
Coronavirus disease 2019 (COVID-19) has imposed the public health measure of social distancing to prevent mass transmission. To monitor social distancing and trace transmission, various types of digital surveillance systems have been developed, including contact tracing systems and drone-based monitoring systems. Due to the inconvenience of manual labor, traditional contact tracing systems are gradually being replaced by efficient automated contact tracing applications developed for smartphones. However, the adoption of automated contact tracing applications introduces inevitable privacy and security challenges. Moreover, unawareness of, or lack of, smartphone usage among the general population has led to drone-based monitoring systems, which raise their own unwelcome privacy and security challenges. This paper discusses recently designed and developed digital surveillance applications and their protocols, as deployed in several countries around the world. Their privacy and security challenges are discussed and analyzed from the viewpoint of privacy legislation. Several recommendations are suggested separately for automated contact tracing systems and drone-based monitoring systems, which could be further explored and implemented to prevent possible privacy violations and to protect unsuspecting people from potential cyber attacks.
Differential privacy protects an individual's privacy by perturbing data at an aggregated level (DP) or at an individual level (LDP). We report four online human-subject experiments investigating the effects of different approaches to communicating differential privacy techniques to laypersons in a health-app data collection setting. Experiments 1 and 2 investigated participants' data disclosure decisions for low-sensitivity and high-sensitivity personal information when given different DP or LDP descriptions. Experiments 3 and 4 uncovered the reasons behind participants' data sharing decisions and examined participants' subjective and objective comprehension of these DP or LDP descriptions. When shown descriptions that explain the implications rather than the definition/processes of the DP or LDP technique, participants demonstrated better comprehension and showed more willingness to share information with LDP than with DP, indicating their understanding of LDP's stronger privacy guarantee compared with DP.
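The individual-level perturbation that LDP performs can be illustrated with randomized response, a classic LDP mechanism; this is a generic sketch for intuition, as the abstract does not state which mechanism the studied health app descriptions referred to, and the function names here are illustrative.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Perturb one user's answer on-device: report the true bit with
    probability e^eps / (e^eps + 1), otherwise flip it. This satisfies
    epsilon-local differential privacy for a single binary attribute."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_proportion(reports, epsilon: float) -> float:
    """Aggregator-side debiasing: recover an unbiased estimate of the
    true fraction of 'yes' answers from the noisy reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

The key point for the DP/LDP contrast the abstract draws is that here each answer is already noisy before it leaves the user's device, so the data collector never sees any individual's true value, whereas aggregate-level DP adds noise only after the collector has gathered the raw data.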
Fog computing is an emerging computing paradigm that has come into consideration for the deployment of IoT applications among researchers and technology industries over the last few years. Fog is highly distributed and consists of a large number of autonomous end devices that contribute to processing. However, the variety of devices contributed by different users is not audited, so the security of Fog devices is a major concern in the Fog computing environment, and mitigating and preventing security threats remains an open research issue. To provide the necessary security for Fog devices, we need to understand the security concerns specific to Fog, identify the aspects of Fog security not covered by existing literature, and aggregate all the issues in Fog security. It should be noted that the computation devices belong to many ordinary users and are not managed by any central entity or governing body; trust and privacy are therefore also key challenges for Fog to gain market adoption. To provide the required trust and privacy, we also focus on authentication, threats, and access control mechanisms and techniques in Fog computing. In this paper, we perform a survey and propose a taxonomy that presents an overview of existing security concerns in the context of the Fog computing paradigm. We discuss Blockchain-based solutions towards a secure Fog computing environment and present various research challenges and directions for future research.
