For more than a century, fingerprints have been used with considerable success to identify criminals and to verify the identity of individuals. The categorical conclusion scheme used by fingerprint examiners, and more generally the inference process followed by forensic scientists, have been heavily criticised in the scientific and legal literature. Instead, scholars have proposed to characterise the weight of forensic evidence using the Bayes factor as the key element of the inference process. In forensic science, quantifying the magnitude of the support is just as important as determining which model is supported. Unfortunately, the complexity of fingerprint patterns renders likelihood-based inference impossible. In this paper, we use an Approximate Bayesian Computation (ABC) model selection algorithm to quantify the weight of fingerprint evidence. We supplement the ABC algorithm with a Receiver Operating Characteristic (ROC) curve to mitigate the effect of the curse of dimensionality. Our modified algorithm is computationally efficient and makes it easier to monitor convergence as the number of simulations increases. We use our method to quantify the weight of fingerprint evidence in forensic science, but we note that it can be applied to any other type of forensic pattern evidence.
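As a rough illustration of the kind of algorithm this abstract describes, here is a minimal rejection-ABC model-choice sketch in Python. Everything in it is assumed for illustration: the two toy simulators, the mean/standard-deviation summaries, the acceptance fraction, and the uniform model prior; the paper's actual fingerprint simulators and its ROC-based refinement of the summary comparison are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(model, theta, n=100):
    """Hypothetical stand-in simulators for two competing models."""
    if model == 0:
        return rng.normal(theta, 1.0, size=n)      # Gaussian noise
    return theta + rng.standard_t(df=3, size=n)    # heavier-tailed noise

def summary(x):
    # Low-dimensional summaries to tame the curse of dimensionality;
    # the paper refines this comparison step with an ROC analysis,
    # which is not reproduced in this sketch.
    return np.array([np.mean(x), np.std(x)])

def abc_model_choice(observed, n_sims=50_000, keep_frac=0.05):
    """Rejection ABC for model selection.

    Approximates posterior model probabilities by the acceptance
    proportions among the simulations whose summaries fall closest
    to the observed data; with equal prior model probabilities,
    their ratio estimates the Bayes factor.
    """
    s_obs = summary(observed)
    records = []
    for _ in range(n_sims):
        m = int(rng.integers(2))        # uniform prior over models
        theta = rng.normal(0.0, 2.0)    # prior on the shared parameter
        dist = np.linalg.norm(summary(simulate(m, theta)) - s_obs)
        records.append((m, dist))
    records.sort(key=lambda r: r[1])    # accept the closest fraction
    kept = [m for m, _ in records[: int(keep_frac * n_sims)]]
    counts = np.bincount(kept, minlength=2)
    return counts / counts.sum()

obs = rng.normal(0.5, 1.0, size=100)    # pretend "observed" data
p0, p1 = abc_model_choice(obs)
print(f"P(M0|data) ~ {p0:.3f}, P(M1|data) ~ {p1:.3f}, "
      f"BF01 ~ {p0 / max(p1, 1e-12):.2f}")
```

Tracking how the estimated probabilities stabilise as n_sims grows mirrors the convergence monitoring the abstract mentions.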
In this paper we review the concepts of Bayesian evidence and Bayes factors, the ratios of model evidences that give the posterior odds under equal prior model probabilities, and their application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques.
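For reference, the evidence and Bayes factor this abstract refers to have the standard textbook definitions below (stated here as background, not recovered from the truncated text); D denotes the data, M_i a model, and theta its parameters:

```latex
Z_i = p(D \mid M_i) = \int p(D \mid \theta, M_i)\, p(\theta \mid M_i)\, \mathrm{d}\theta,
\qquad
B_{12} = \frac{Z_1}{Z_2},
\qquad
\frac{p(M_1 \mid D)}{p(M_2 \mid D)} = B_{12}\, \frac{p(M_1)}{p(M_2)}.
```

The analytic, approximate and numerical techniques the abstract mentions are different ways of evaluating the integral Z_i.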
In a network meta-analysis, some of the collected studies may deviate markedly from the others, for example by having very unusual effect sizes. These deviating studies can be regarded as outlying with respect to the rest of the network and can be influential…
Models defined by stochastic differential equations (SDEs) allow for the representation of random variability in dynamical systems. The relevance of this class of models is growing in many applied research areas, where they are already a standard tool to model…
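Since this abstract's specifics are cut off, the following is only a generic sketch of how such SDE models are simulated in practice: an Euler–Maruyama discretisation, applied here to a hypothetical Ornstein–Uhlenbeck example. The function names and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_end, n_steps, rng):
    """Simulate one path of dX_t = drift(X_t) dt + diffusion(X_t) dW_t."""
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dw
    return x

# Hypothetical example: an Ornstein-Uhlenbeck process,
# dX_t = theta * (mu - X_t) dt + sigma dW_t.
theta, mu, sigma = 1.5, 0.0, 0.3
rng = np.random.default_rng(1)
path = euler_maruyama(lambda x: theta * (mu - x),
                      lambda x: sigma,
                      x0=1.0, t_end=5.0, n_steps=500, rng=rng)
print(path[:5])
```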
This report is a collection of comments on the Read Paper of Fearnhead and Prangle (2011), to appear in the Journal of the Royal Statistical Society Series B, along with a reply from the authors.
For many decades, statisticians have attempted to prepare the Bayesian omelette without breaking the Bayesian eggs; that is, to obtain probabilistic likelihood-based inferences without relying on informative prior distributions. A recent example…