The field of behavioural biometrics stands as an appealing alternative to more traditional biometric systems due to its ease of use from the user's perspective and its potential robustness to presentation attacks. This paper focuses on a specific type of behavioural biometric utilising swipe dynamics, also referred to as touch gestures. In touch gesture authentication, a user swipes across the touchscreen of a mobile device to perform an authentication attempt. A key characteristic of touch gesture authentication, and of new behavioural biometrics in general, is the scarcity of data available to train and validate models. From a machine learning perspective, this presents the classic curse of dimensionality problem, and the methodology presented here focuses on Bayesian unsupervised models, as they are well suited to such conditions. This paper presents results from a set of experiments consisting of 38 sessions with labelled victim data as well as blind and over-the-shoulder presentation attacks. Three models are compared on this dataset; two single-mode models, a shrunk covariance estimate and a Bayesian Gaussian distribution, as well as a Bayesian non-parametric infinite mixture of Gaussians, modelled as a Dirichlet process. Equal error rates (EERs) for the three models are compared, with particular attention to how these vary between the two single-mode models at differing numbers of enrolment samples.
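The model comparison described above can be sketched with off-the-shelf estimators: a Ledoit-Wolf shrunk covariance scored via Mahalanobis distance, and a truncated variational Dirichlet-process mixture of Gaussians, each producing scores from which an EER is read off. This is a minimal illustration on synthetic data; the paper's swipe features, its Bayesian Gaussian model, and its experimental protocol are not reproduced here.

```python
# Hedged sketch: one-class scoring with a shrunk covariance estimate vs. a
# Dirichlet-process Gaussian mixture, with EER computed from the scores.
# All data below is synthetic and purely illustrative.
import numpy as np
from sklearn.covariance import LedoitWolf
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
genuine = rng.normal(0.0, 1.0, size=(40, 5))   # enrolment swipes (synthetic)
attacks = rng.normal(1.5, 1.0, size=(40, 5))   # presentation attacks (synthetic)

# Model 1: shrunk covariance -> negative Mahalanobis distance as a score.
lw = LedoitWolf().fit(genuine)
score_lw = lambda X: -lw.mahalanobis(X)        # higher = more genuine-like

# Model 2: Dirichlet-process mixture of Gaussians (variational truncation).
dpgmm = BayesianGaussianMixture(
    n_components=5, weight_concentration_prior_type="dirichlet_process",
    covariance_type="full", random_state=0).fit(genuine)
score_dp = lambda X: dpgmm.score_samples(X)    # log-likelihood as a score

def eer(genuine_scores, attack_scores):
    """Equal error rate: threshold where false accepts ~= false rejects."""
    thresholds = np.sort(np.concatenate([genuine_scores, attack_scores]))
    far = np.array([(attack_scores >= t).mean() for t in thresholds])
    frr = np.array([(genuine_scores < t).mean() for t in thresholds])
    i = int(np.argmin(np.abs(far - frr)))
    return (far[i] + frr[i]) / 2

print("EER (Ledoit-Wolf):", eer(score_lw(genuine), score_lw(attacks)))
print("EER (DP mixture): ", eer(score_dp(genuine), score_dp(attacks)))
```

With well-separated synthetic classes both EERs come out low; on real swipe data the interesting comparison is how the EERs diverge as the number of enrolment samples shrinks.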
We propose a new Statistical Complexity Measure (SCM) to assess edge maps without Ground Truth (GT) knowledge. The measure is the product of two indices: an \emph{Equilibrium} index $\mathcal{E}$, obtained by projecting the edge map onto a family of edge patterns, and an \emph{Entropy} index $\mathcal{H}$, defined as a function of the Kolmogorov--Smirnov (KS) statistic. This new measure can be used for performance characterization, which includes: (i)~the evaluation of a specific algorithm (intra-technique process) in order to identify its best parameters, and (ii)~the comparison of different algorithms (inter-technique process) in order to rank them according to their quality. Results on images from the South Florida and Berkeley databases show that our approach significantly improves on Pratt's Figure of Merit (PFoM), the standard reference-based edge map evaluation measure, as it takes more features into account in its evaluation.
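The product structure of the measure can be illustrated with a toy version. The paper's actual pattern family and KS reference distribution are not specified here, so both index definitions below are illustrative assumptions: a balance-based stand-in for $\mathcal{E}$ and a KS statistic against a uniform reference for $\mathcal{H}$.

```python
# Hedged sketch of a ground-truth-free edge-map score of the form SCM = E * H.
# Both indices are toy stand-ins, not the paper's definitions.
import numpy as np
from scipy.stats import kstest

def equilibrium_index(edges):
    """Toy 'equilibrium': balance between edge and non-edge pixels,
    maximal (1.0) when the map is neither empty nor saturated."""
    p = edges.mean()
    return 4.0 * p * (1.0 - p)

def entropy_index(gradient_magnitudes):
    """Toy entropy index from the KS statistic against a uniform reference:
    H = 1 - D_KS, so a flat (maximally uninformative) histogram scores 1."""
    d = kstest(gradient_magnitudes.ravel(), "uniform").statistic
    return 1.0 - d

rng = np.random.default_rng(0)
edge_map = (rng.random((64, 64)) < 0.1).astype(float)  # synthetic binary edges
grads = rng.random((64, 64))                           # synthetic gradients in [0, 1]

scm = equilibrium_index(edge_map) * entropy_index(grads)
print("SCM:", scm)
```

Because both factors lie in $[0, 1]$, the product does too, and either a degenerate edge map or a degenerate gradient distribution drives the score toward zero.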
In this paper, a Bayesian semiparametric copula approach is used to model the underlying multivariate distribution $F_{true}$. First, a Dirichlet process is constructed on the unknown marginal distributions of $F_{true}$. Then a Gaussian copula model is utilized to capture the dependence structure of $F_{true}$. As a result, a Bayesian multivariate normality test is developed by combining the relative belief ratio and the energy distance. Several theoretical results for the approach are derived. Finally, through several simulated examples and a real data set, the proposed approach demonstrates excellent performance.
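The role of the energy distance as a normality-test statistic can be sketched in one dimension with a simple Monte Carlo calibration. This is only an illustration: the paper's relative-belief-ratio machinery, the Dirichlet-process marginals, and the copula construction are not reproduced here.

```python
# Hedged sketch: energy distance between a sample and draws from a fitted
# Gaussian as a (1-D) normality test statistic, calibrated by simulation.
import numpy as np
from scipy.stats import energy_distance

def energy_normality_pvalue(x, n_sim=200, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = x.mean(), x.std(ddof=1)
    ref = rng.normal(mu, sigma, size=x.size)
    stat = energy_distance(x, ref)
    # Null distribution: the statistic recomputed on genuinely Gaussian pairs.
    null = [energy_distance(rng.normal(mu, sigma, x.size),
                            rng.normal(mu, sigma, x.size))
            for _ in range(n_sim)]
    return (np.sum(np.array(null) >= stat) + 1) / (n_sim + 1)

rng = np.random.default_rng(1)
p_gauss = energy_normality_pvalue(rng.normal(0.0, 1.0, 300))     # should be large-ish
p_exp = energy_normality_pvalue(rng.exponential(1.0, 300))       # should be small
print(p_gauss, p_exp)
```

A Gaussian sample yields an unremarkable p-value while a skewed exponential sample is flagged; the Bayesian approach in the paper replaces this frequentist calibration with evidence measured by the relative belief ratio.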
Solar flares are large-scale releases of energy in the solar atmosphere, which are characterised by rapid changes in the hydrodynamic properties of plasma from the photosphere to the corona. Solar physicists have typically attempted to understand these complex events using a combination of theoretical models and observational data. From a statistical perspective, there are many challenges associated with making accurate and statistically significant comparisons between theory and observations, due primarily to the large number of free parameters associated with physical models. This class of ill-posed statistical problem is ideally suited to Bayesian methods. In this paper, the solar flare studied by Raftery et al. (2009) is reanalysed using a Bayesian framework. This enables us to study the evolution of the flare's temperature, emission measure and energy loss in a statistically self-consistent manner. The Bayesian model selection techniques imply that no decision can be made as to whether conductive or non-thermal beam heating plays the more important role in heating the flare plasma during the impulsive phase of this event.
The cyclical and heterogeneous nature of many substance use disorders highlights the need to adapt the type or the dose of treatment to accommodate the specific and changing needs of individuals. The Adaptive Treatment for Alcohol and Cocaine Dependence study (ENGAGE) is a multi-stage randomized trial that aimed to provide longitudinal data for constructing treatment strategies to improve patients' engagement in therapy. However, the high rate of noncompliance and the lack of analytic tools to account for noncompliance have impeded researchers from using the data to achieve the main goal of the trial. We overcome this issue by defining our target parameter as the mean outcome under different treatment strategies for given potential compliance strata, and we propose a Bayesian semiparametric model to estimate this quantity. One important feature of our work, although it adds substantial complexity to the analysis, is that we consider partial rather than binary compliance classes, which is more relevant in longitudinal studies. We assess the performance of our method through comprehensive simulation studies. We illustrate its application on the ENGAGE study and demonstrate that the optimal treatment strategy depends on compliance strata.
The majority of systems rely on passwords for user authentication, but the weaknesses and widespread use of passwords raise significant security concerns, regardless of their encrypted form. Users reuse the same password across different accounts, administrators rarely check password files for flaws that might lead to successful cracking, and tight security policies mandating regular password replacement are often lacking; these are a few of the problems that need to be addressed. The proposed research work aims at strengthening this security mechanism to prevent penetrations, password theft, and attempted break-ins, thereby securing computing systems. The selected solution approach is two-fold: it implements a two-factor authentication scheme to prevent unauthorized access, accompanied by Honeyword principles to detect corrupted or stolen tokens. Both can be integrated into any platform or web application with the use of QR codes and a mobile phone.
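The honeyword idea referenced above can be sketched briefly: alongside the hash of the real password, the system stores hashes of several decoys ("sweetwords"), and a login attempt that matches a decoy signals that the credential file has been stolen. The function names, decoy generation, and policy below are illustrative assumptions, not the paper's implementation, and the QR-code/two-factor component is omitted.

```python
# Hedged sketch of honeyword-based theft detection. A separate, hardened
# "honeychecker" service would normally hold true_index; here it is returned
# directly for simplicity.
import hashlib
import secrets

def make_sweetwords(real_password, k=4):
    """Store k decoy hashes alongside the real password's hash, shuffled."""
    decoys = [secrets.token_urlsafe(8) for _ in range(k)]
    words = decoys + [real_password]
    secrets.SystemRandom().shuffle(words)
    hashes = [hashlib.sha256(w.encode()).hexdigest() for w in words]
    true_index = words.index(real_password)   # kept by the honeychecker
    return hashes, true_index

def check_login(attempt, hashes, true_index):
    """Accept the real password, reject unknowns, alarm on a decoy hit."""
    h = hashlib.sha256(attempt.encode()).hexdigest()
    if h not in hashes:
        return "reject"
    return "accept" if hashes.index(h) == true_index else "alarm"

hashes, idx = make_sweetwords("correct horse")
print(check_login("correct horse", hashes, idx))  # accept
print(check_login("wrong guess", hashes, idx))    # reject
```

An attacker who cracks the stolen file sees several plausible candidates and cannot tell which is real; submitting a decoy triggers the "alarm" path, which is the detection property the abstract relies on. A production system would use a slow password hash (e.g. bcrypt) rather than plain SHA-256.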