
Statistical issues in Serial Killer Nurse cases

Published by: Richard D. Gill
Publication date: 2021
Research field: Mathematical statistics
Paper language: English





We discuss statistical issues in cases of serial killer nurses, focussing on the Dutch case of the nurse Lucia de Berk, arrested under suspicion of murder in 2001, sentenced to life imprisonment, but declared innocent in 2010; and the case of the English nurse Ben Geen, arrested in 2004, also given a life sentence. At the trial of Ben Geen, a statistical expert was refused permission to present evidence on statistical biases concerning the way suspicious cases were identified by a hospital team of investigators. The judge ruled that the expert's written evidence was merely common sense. An application to the CCRC to review the case was turned down, since the application only presented statistical evidence but did not re-address the medical evidence presented at the original trials. This rejection has been successfully challenged in court, and the CCRC has withdrawn it. The paper includes some striking new statistical findings on the Ben Geen case, as well as giving advice to statisticians involved in future cases, which are not infrequent. Statisticians need to be warned of the pitfalls which await them.
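To make one of the biases concrete, here is a toy simulation of my own (the nurse counts, shift numbers and incident counts are invented, not taken from the paper): investigators who single out the nurse with the most incidents on her shifts, and only then assess the coincidence, are implicitly looking at the maximum over many nurses, not at one pre-specified nurse.

```python
# Toy illustration (not from the paper) of the post-hoc selection bias:
# in hospitals where every nurse is innocent, the worst-looking nurse
# still shows far more incident overlap than any single nurse "should".
import numpy as np

rng = np.random.default_rng(1)
n_nurses, n_shifts, n_incidents = 30, 1000, 20   # invented numbers
shifts_per_nurse = 100                           # each nurse works 10% of shifts

worst_overlap = []
for _ in range(2000):  # many hypothetical all-innocent hospitals
    incidents = rng.choice(n_shifts, size=n_incidents, replace=False)
    overlaps = [
        np.isin(incidents,
                rng.choice(n_shifts, size=shifts_per_nurse, replace=False)).sum()
        for _ in range(n_nurses)
    ]
    worst_overlap.append(max(overlaps))          # the nurse who gets suspected

# A single pre-specified nurse expects 20 * 0.10 = 2 incidents on shift;
# the most "suspicious" of 30 innocent nurses typically shows 4-6.
print("Mean overlap for the worst-looking nurse:", np.mean(worst_overlap))
```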




Read also

We analyze the time pattern of the activity of a serial killer, who during twelve years had murdered 53 people. The plot of the cumulative number of murders as a function of time is of Devil's staircase type. The distribution of the intervals between murders (step length) follows a power law with the exponent of 1.4. We propose a model according to which the serial killer commits murders when neuronal excitation in his brain exceeds a certain threshold. We model this neural activity as a branching process, which in turn is approximated by a random walk. As the distribution of the random walk return times is a power law with the exponent 1.5, the distribution of the inter-murder intervals is thus explained. We illustrate analytical results by numerical simulation. Time pattern activity data from two other serial killers further substantiate our analysis.
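As a rough check of the stated mechanism, the sketch below (my assumption: a symmetric ±1 random walk stands in for the neural-excitation model) simulates return times to zero, whose classical P(τ) ~ τ^(-3/2) law is the exponent 1.5 quoted above. For a density exponent of 1.5, the survival function's log-log tail slope should be about -0.5.

```python
# Minimal sketch: intervals between zero-returns of a symmetric random walk
# follow P(tau) ~ tau^(-3/2); we estimate the survival-function tail slope.
import numpy as np

rng = np.random.default_rng(0)
walk = np.cumsum(rng.choice([-1, 1], size=5_000_000))
returns = np.flatnonzero(walk == 0)        # steps at which the walk is at 0
intervals = np.diff(returns)               # waiting times between returns

tau = np.sort(intervals).astype(float)
survival = 1.0 - np.arange(len(tau)) / len(tau)   # empirical P(T > t)
mask = (tau > 10) & (survival > 1e-3)
slope, _ = np.polyfit(np.log(tau[mask]), np.log(survival[mask]), 1)
print(f"Survival tail slope: {slope:.2f}  (theory: -0.5 -> density ~ t^-1.5)")
```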
Louis Lyons, 2016
Various statistical issues relevant to searches for new physics or to parameter determination in analyses of data in neutrino experiments are briefly discussed.
Louis Lyons, 2014
Given the cost, both financial and even more importantly in terms of human effort, in building High Energy Physics accelerators and detectors and running them, it is important to use good statistical techniques in analysing data. Some of the statistical issues that arise in searches for New Physics are discussed briefly. They include topics such as: Should we insist on the 5 sigma criterion for discovery claims? The probability of A, given B, is not the same as the probability of B, given A. The meaning of p-values. What is Wilks' Theorem and when does it not apply? How should we deal with the 'Look Elsewhere Effect'? Dealing with systematics such as background parametrisation. Coverage: What is it and does my method have the correct coverage? The use of p0 versus p1 plots.
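For readers unfamiliar with the "5 sigma" convention mentioned above, this small illustrative snippet converts sigma thresholds into one-sided p-values via the standard normal survival function (scipy assumed available):

```python
# Illustrative only: the one-sided p-value behind the "n sigma" criterion.
from scipy.stats import norm

for n_sigma in (3, 4, 5):
    print(f"{n_sigma} sigma -> one-sided p = {norm.sf(n_sigma):.2e}")
# 5 sigma corresponds to p ~ 2.9e-7, i.e. roughly a 1-in-3.5-million chance
# of so large a fluctuation under the background-only hypothesis.
```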
We provide accessible insight into the current replication crisis in statistical science, by revisiting the old metaphor of court trial as hypothesis test. Inter alia, we define and diagnose harmful statistical witch-hunting both in justice and science, which extends to the replication crisis itself, where a hunt on p-values is currently underway.
Generalized Chinese Remainder Theorem (CRT) is a well-known approach to solve ambiguity resolution related problems. In this paper, we study the robust CRT reconstruction for multiple numbers from a view of statistics. To the best of our knowledge, it is the first rigorous analysis on the underlying statistical model of CRT-based multiple parameter estimation. To address the problem, two novel approaches are established. One is to directly calculate a conditional maximum a posteriori probability (MAP) estimation of the residue clustering, and the other is based on a generalized wrapped Gaussian mixture model to iteratively search for MAP of both estimands and clustering. Residue error correcting codes are introduced to improve the robustness further. Experimental results show that the statistical schemes achieve much stronger robustness compared to state-of-the-art deterministic schemes, especially in heavy-noise scenarios.
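For context, the sketch below shows only the classical, deterministic CRT reconstruction that the statistical schemes above build on; the MAP/clustering machinery described in the abstract is not reproduced here.

```python
# Minimal sketch: classical CRT reconstruction of one integer from its
# residues modulo pairwise-coprime moduli.
from math import prod

def crt_reconstruct(residues, moduli):
    """Return x in [0, prod(moduli)) with x = r_i (mod m_i) for all i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse
    return x % M

print(crt_reconstruct([2, 3, 2], [3, 5, 7]))  # 23, since 23 = 2,3,2 mod 3,5,7
```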


