
The Reliability and Acceptance of Biometric System in Bangladesh: Users Perspective

 Added by Shaykh Siddique
 Publication date 2021
Language: English





Biometric systems are among the latest technologies for unique identification, and people all over the world prefer this technology for authentication security. The goal of this research is to evaluate biometric systems in terms of system reliability and user satisfaction. Because the technology depends entirely on personal data, user satisfaction is a principal factor in the quality and reliability of biometric systems. To keep pace with the digital era, it is extremely important to assess users' concerns about data security, as these systems perform authentication by analyzing users' personal data. The study shows that users are more satisfied with biometric systems than with other security systems. In addition, hardware failure is a major issue faced by users of biometric systems. Finally, a matrix is generated from users' opinions to compare the performance of popular biometric systems. As system reliability and user satisfaction are the focal issues of this research, biometric service providers can use these findings to identify which aspects of their services need improvement. This study can also serve as a clear guide for Bangladeshi users, helping them decide which biometric system to choose.
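
As a rough illustration of how such a comparison matrix could be assembled from survey responses, the sketch below aggregates per-user ratings into mean scores per system and criterion with pandas. It is not taken from the paper; the systems, criteria, and 1-5 rating scale are invented placeholders.

```python
# Minimal sketch, assuming a hypothetical survey layout: one row per
# user rating of a biometric system against an evaluation criterion.
import pandas as pd

responses = pd.DataFrame([
    {"system": "fingerprint", "criterion": "reliability",  "rating": 4},
    {"system": "fingerprint", "criterion": "satisfaction", "rating": 5},
    {"system": "face",        "criterion": "reliability",  "rating": 3},
    {"system": "face",        "criterion": "satisfaction", "rating": 4},
    {"system": "iris",        "criterion": "reliability",  "rating": 4},
    {"system": "iris",        "criterion": "satisfaction", "rating": 3},
])

# Comparison matrix: mean user rating per system and criterion.
matrix = responses.pivot_table(index="system", columns="criterion",
                               values="rating", aggfunc="mean")
print(matrix)
```
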

Related research

With the rapid advancement of technology, different biometric user authentication and identification systems are emerging. Traditional biometric systems such as face, fingerprint, and iris recognition, and keystroke dynamics are prone to cyber-attacks and suffer from various disadvantages. Electroencephalography (EEG) based authentication has shown promise in overcoming these limitations. However, EEG-based authentication is less accurate due to signal variability under different psychological and physiological conditions. On the other hand, keystroke dynamics-based identification offers high accuracy but suffers from various spoofing attacks. To overcome these challenges, we propose a novel multimodal biometric system combining EEG and keystroke dynamics. First, a dataset was created by acquiring both keystroke dynamics and EEG signals from 10 users, with 500 trials per user across 10 different sessions. Statistical, time-domain, and frequency-domain features were extracted and ranked from the EEG signals, and key features were extracted from the keystroke dynamics. Different classifiers were trained, validated, and tested for both individual and combined modalities under two classification strategies: personalized and generalized. Results show that very high accuracy can be achieved in both the generalized and personalized cases for the combination of EEG and keystroke dynamics. The identification and authentication accuracies were found to be 99.80% and 99.68% for the Extreme Gradient Boosting (XGBoost) and Random Forest classifiers, respectively, outperforming the individual modalities by a significant margin (around 5 percent). We also developed a binary template matching-based algorithm, which gives 93.64% accuracy while running 6X faster. The proposed method is secure and reliable for any kind of biometric authentication.
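
The feature-level fusion idea can be pictured with a minimal sketch like the one below. It assumes pre-extracted placeholder feature vectors and a scikit-learn Random Forest; the data, feature dimensions, and model settings are invented and do not reproduce the authors' pipeline.

```python
# Sketch of multimodal fusion by feature concatenation, under assumed
# placeholder features rather than real EEG/keystroke recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, trials = 10, 50
labels = np.repeat(np.arange(n_users), trials)

# Placeholder features standing in for extracted EEG (statistical, time,
# frequency domain) and keystroke (hold/flight time) features per trial.
eeg_feats = rng.normal(size=(n_users * trials, 32)) + 0.25 * labels[:, None]
key_feats = rng.normal(size=(n_users * trials, 12)) + 0.25 * labels[:, None]

# Feature-level fusion: concatenate both modalities for each trial.
fused = np.hstack([eeg_feats, key_feats])

X_tr, X_te, y_tr, y_te = train_test_split(
    fused, labels, test_size=0.3, stratify=labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("identification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```
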
Imagine a food recommender system: how would we check whether it is causing and fostering unhealthy eating habits or merely reflecting users' interests? How much of a user's experience over time with a recommender is caused by the recommender system's choices and biases, and how much is based on the user's own preferences and biases? Popularity bias and filter bubbles are two of the most well-studied recommender system biases, but most of the prior research has focused on understanding system behavior in a single recommendation step. How do these biases interplay with user behavior, and what types of user experiences are created from repeated interactions? In this work, we offer a simulation framework for measuring the impact of a recommender system under different types of user behavior. Using this simulation framework, we can (a) isolate the effect of the recommender system from the user preferences, and (b) examine how the system performs not just on average for an average user but also in the extreme experiences under atypical user behavior. As part of the simulation framework, we propose a set of evaluation metrics over the simulations to understand the recommender system's behavior. Finally, we present two empirical case studies, one on traditional collaborative filtering in MovieLens and one on a large-scale production recommender system, to understand how popularity bias manifests over time.
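
A toy version of such a simulation loop might look like the following. The most-popular recommendation rule, the acceptance model, and the concentration metric are illustrative assumptions, not the framework proposed in the paper.

```python
# Sketch: repeated interactions between a popularity-driven recommender
# and a simulated user, tracking how popularity concentrates over time.
import numpy as np

rng = np.random.default_rng(1)
n_items, steps = 50, 500
popularity = np.ones(n_items)      # global interaction counts per item
preference = rng.random(n_items)   # one simulated user's latent preferences

history = []
for _ in range(steps):
    # Recommender: sample among the current top-5 most popular items.
    top = np.argsort(popularity)[-5:]
    item = rng.choice(top)
    # User: accept the recommendation with probability given by preference.
    if rng.random() < preference[item]:
        popularity[item] += 1
    history.append(item)

# Simple evaluation metric: share of interactions on the 5 most popular items.
top5 = set(np.argsort(popularity)[-5:])
print("concentration on top-5 items:", np.mean([i in top5 for i in history]))
```
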
The modern electric power grid, known as the Smart Grid, has rapidly transformed the isolated and centrally controlled power system into a massively connected cyber-physical system that benefits from the revolutions happening in communications and the fast adoption of Internet of Things devices. While the synergy of a vast number of cyber-physical entities has allowed the Smart Grid to be much more effective and sustainable in meeting the growing global energy challenges, it has also brought with it a large number of vulnerabilities, resulting in breaches of data integrity, confidentiality, and availability. False data injection (FDI) appears to be among the most critical cyberattacks and has been a focal point of interest for both research and industry. To this end, this paper presents a comprehensive review of recent advances in defence countermeasures against FDI attacks in the Smart Grid infrastructure. Relevant existing literature is evaluated and compared in terms of its theoretical and practical significance to Smart Grid cybersecurity. In conclusion, a range of technical limitations of existing false data attack detection research is identified, and a number of future research directions are recommended.
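
One baseline that such reviews commonly discuss is residual-based bad data detection in DC state estimation, which stealthy FDI attacks are crafted to evade. The sketch below uses an invented measurement matrix and threshold purely to illustrate that point; it is not a countermeasure from the surveyed literature.

```python
# Sketch of residual-based bad data detection, with an assumed linear
# (DC) measurement model z = H x + noise and an illustrative threshold.
import numpy as np

rng = np.random.default_rng(2)
H = rng.normal(size=(8, 3))        # measurement matrix: 8 meters, 3 state variables
x_true = rng.normal(size=3)
z = H @ x_true + 0.01 * rng.normal(size=8)   # clean measurements with sensor noise

def alarm(z, H, tau=0.1):
    """Raise an alarm if the least-squares measurement residual exceeds tau."""
    x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
    return np.linalg.norm(z - H @ x_hat) > tau

print("clean measurements:", alarm(z, H))                           # no alarm expected
print("random injection:  ", alarm(z + 0.5 * rng.normal(size=8), H))  # usually detected
# A stealthy FDI attack a = H @ c leaves the residual unchanged and evades the test.
print("stealthy injection:", alarm(z + H @ np.array([1.0, -0.5, 0.3]), H))
```
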
Power system frequency can be captured in digital recordings and extracted for comparison against a reference database for forensic time-stamp verification. This is known as the electric network frequency (ENF) criterion, enabled by its properties of random fluctuation and intra-grid consistency. In essence, this is a task of matching a short random sequence within a long reference, and the reliability of this criterion depends mainly on whether this match is unique and correct. In this paper, we comprehensively analyze the factors affecting the reliability of ENF matching, including the length of the test recording, the length of the reference, the temporal resolution, and the signal-to-noise ratio (SNR). For synthetic analysis, we incorporate the first-order autoregressive (AR) ENF model and propose an efficient time-frequency domain (TFD) noisy ENF synthesis method. Reliability analysis schemes are then proposed for both synthetic and real-world data. Through a comprehensive study, we reveal that while the SNR is an important external factor in determining whether time-stamp verification is viable, the length of the test recording is the most important inherent factor, followed by the length of the reference. The temporal resolution, however, has little impact on the matching process.
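
The matching task can be illustrated with a small synthetic experiment, assuming an AR(1) fluctuation model around a 50 Hz nominal frequency and a simple minimum-MSE sliding search. The parameters below are invented and do not reproduce the paper's TFD synthesis or analysis scheme.

```python
# Sketch: synthesize an AR(1) reference ENF sequence, cut out a short
# noisy test segment, and locate it in the reference by minimum MSE.
import numpy as np

rng = np.random.default_rng(3)
N, rho = 20_000, 0.999
dev = np.zeros(N)
for t in range(1, N):
    # First-order autoregressive fluctuation around the nominal frequency.
    dev[t] = rho * dev[t - 1] + 0.001 * rng.normal()
reference = 50.0 + dev

# Short "test recording": a slice of the reference plus measurement noise.
start, L = 8_000, 300
test = reference[start:start + L] + 0.002 * rng.normal(size=L)

# Matching: slide the test segment over the reference, pick the minimum-MSE offset.
windows = np.lib.stride_tricks.sliding_window_view(reference, L)
mse = ((windows - test) ** 2).mean(axis=1)
print("true offset:", start, "estimated offset:", int(np.argmin(mse)))
```
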
Enforcing data protection and privacy rules within large data processing applications is becoming increasingly important, especially in the light of GDPR and similar regulatory frameworks. Most modern data processing happens on top of a distributed storage layer, and securing this layer against accidental or malicious misuse is crucial to ensuring global privacy guarantees. However, the performance overhead and the additional complexity of doing so are often assumed to be significant; in this work, we describe a path forward that tackles both challenges. We propose Software-Defined Data Protection (SDP), an adaptation of the Software-Defined Storage approach to non-performance aspects: a trusted controller translates company- and application-specific policies into a set of rules deployed on the storage nodes. These, in turn, apply the rules at line rate but do not make any decisions on their own. Such an approach decouples often-changing policies from request-level enforcement and allows storage nodes to implement the latter more efficiently. Even though in-storage processing brings challenges, mainly because it can jeopardize line-rate processing, we argue that today's Smart Storage solutions can already implement the required functionality, thanks to the separation of concerns introduced by SDP. We highlight the challenges that remain, especially that of trusting the storage nodes. These need to be tackled before we can reach widespread adoption in cloud environments.
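
The controller/enforcement split could be sketched roughly as follows, with an invented policy and rule format; real SDP deployments would push the compiled rules down to smart storage hardware rather than evaluate them in Python.

```python
# Conceptual sketch of the SDP split: a controller compiles high-level
# policies into flat per-request rules, and a storage node only looks
# rules up at enforcement time. Formats and fields are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    principal: str      # who is making the request
    object_prefix: str  # which object keys the rule covers
    action: str         # "allow", "deny", or "redact"

def compile_policies(policies):
    """Controller side: translate policy dicts into simple match rules."""
    return [Rule(p["role"], p["prefix"], p["action"]) for p in policies]

def enforce(rules, principal, key):
    """Storage-node side: first matching rule wins, deny by default."""
    for r in rules:
        if r.principal == principal and key.startswith(r.object_prefix):
            return r.action
    return "deny"

rules = compile_policies([
    {"role": "analyst", "prefix": "sales/", "action": "redact"},
    {"role": "admin", "prefix": "", "action": "allow"},
])
print(enforce(rules, "analyst", "sales/2021/q3.parquet"))  # -> "redact"
print(enforce(rules, "analyst", "hr/payroll.csv"))         # -> "deny"
```
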
