
Bayesian decision theory for tree-based adaptive screening tests with an application to youth delinquency

Posted by Chelsea Krantsevich
Publication date: 2021
Research field: Mathematical statistics
Paper language: English





Crime prevention strategies based on early intervention depend on accurate risk assessment instruments for identifying high-risk youth. It is important in this context that the instruments be convenient to administer, which means, in particular, that they must be reasonably brief; adaptive screening tests are useful for this purpose. Although item response theory (IRT) has a long and rich history of producing reliable adaptive tests, adaptive tests constructed using classification and regression trees are becoming a popular alternative to the traditional IRT approach to item selection. On the upside, unlike IRT, tree-based questionnaires require no real-time parameter estimation during administration. On the downside, while item response theory provides robust criteria for terminating the exam, the stopping criterion for a tree-based adaptive test (the maximum tree depth) is unclear. We present a Bayesian decision theory approach for characterizing the trade-offs of administering tree-based questionnaires of different lengths. This formalism involves specifying 1) a utility function measuring the goodness of the assessment; 2) a target population over which this utility should be maximized; and 3) an action space comprising different-length assessments, populated via a tree-fitting algorithm. Using this framework, we provide uncertainty estimates for the trade-offs of shortening the exam, allowing practitioners to determine an optimal exam length in a principled way. The method is demonstrated through an application to youth delinquency risk assessment in Honduras.
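As a rough illustration of the trade-off the abstract describes, the sketch below fits classification trees of increasing depth (each extra level corresponding to one more question asked) and bootstraps a placeholder utility over a held-out sample standing in for the target population. The synthetic dataset, the balanced-accuracy utility, and all parameters are hypothetical stand-ins, not the paper's actual choices.

```python
# Minimal sketch: expected utility of tree-based screeners of different
# depths (exam lengths), with bootstrap uncertainty. Balanced accuracy is a
# placeholder utility; the paper's utility function and data are not shown.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
# Synthetic stand-in for item-response data with a binary risk label.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def utility_with_uncertainty(depth, n_boot=200):
    """Mean and bootstrap SE of the placeholder utility for a depth-`depth` exam."""
    tree = DecisionTreeClassifier(max_depth=depth, random_state=0).fit(X_tr, y_tr)
    pred = tree.predict(X_te)
    scores = [
        balanced_accuracy_score(y_te[idx], pred[idx])
        for idx in (rng.integers(0, len(y_te), len(y_te)) for _ in range(n_boot))
    ]
    return np.mean(scores), np.std(scores)

# Each extra level of tree depth corresponds to one more question asked.
for depth in range(1, 9):
    mean_u, se_u = utility_with_uncertainty(depth)
    print(f"max depth {depth}: utility {mean_u:.3f} +/- {se_u:.3f}")
```

Plotting the mean utility against depth, with its bootstrap bands, makes the diminishing returns of a longer exam visible and lets a practitioner pick the shortest depth whose utility is acceptably close to the plateau.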




Read also

Leveraging preclinical animal data for a phase I first-in-man trial is appealing yet challenging. A prior based on animal data may place large probability mass on values of the dose-toxicity model parameter(s), which appear infeasible in light of data accrued from the ongoing phase I clinical trial. In this paper, we seek to use animal data to improve decision making in a model-based dose-escalation procedure for phase I oncology trials. Specifically, animal data are incorporated via a robust mixture prior for the parameters of the dose-toxicity relationship. This prior changes dynamically as the trial progresses. After completion of treatment for each cohort, the weight allocated to the informative component, obtained based on animal data alone, is updated using a decision-theoretic approach to assess the commensurability of the animal data with the human toxicity data observed thus far. In particular, we measure commensurability as a function of the utility of optimal prior predictions for the human responses (toxicity or no toxicity) on each administered dose. The proposed methodology is illustrated through several examples and an extensive simulation study. Results show that our proposal can address difficulties in coping with prior-data conflict arising in sequential trials with a small sample size.
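The reweighting idea can be illustrated with a toy beta-binomial model at a single dose. Note that the weight update below uses standard marginal likelihoods rather than the paper's decision-theoretic, utility-based commensurability measure, and the Beta(8, 32) animal-informed component and cohort outcomes are hypothetical.

```python
# Toy sketch of a robust mixture prior for the toxicity probability at one
# dose. The weight update here is the usual marginal-likelihood update for
# a Beta mixture; the paper instead reweights via the utility of the
# prior's predictions for observed human responses.
from scipy.stats import betabinom

animal = [8.0, 32.0]  # Beta(8, 32): hypothetical animal-informed component (~0.2 rate)
vague = [1.0, 1.0]    # Beta(1, 1): weakly informative component
w = 0.8               # initial weight on the animal-informed component

# Cohorts of 3 patients; x = dose-limiting toxicities observed in the cohort.
for n, x in [(3, 0), (3, 1), (3, 3)]:
    # Posterior-predictive probability of the cohort outcome under each component.
    m_animal = betabinom.pmf(x, n, *animal)
    m_vague = betabinom.pmf(x, n, *vague)
    w = w * m_animal / (w * m_animal + (1 - w) * m_vague)  # reweight components
    for comp in (animal, vague):                           # conjugate Beta update
        comp[0] += x
        comp[1] += n - x
    print(f"cohort {x}/{n} toxicities: weight on animal component -> {w:.3f}")
```

As the observed human toxicities drift away from what the animal-informed component predicted, its weight shrinks and the mixture falls back on the vague component, which is the robustness the abstract describes.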
We develop a new methodology for spatial regression of aggregated outputs on multi-resolution covariates. Such problems often occur with spatial data, for example in crop yield prediction, where the output is spatially aggregated over an area and the covariates may be observed at multiple resolutions. Building upon previous work on aggregated output regression, we propose a regression framework to synthesise the effects of the covariates at different resolutions on the output and provide uncertainty estimation. We show that, for a crop yield prediction problem, our approach is more scalable, via variational inference, than existing multi-resolution regression models. We also show that our framework yields good predictive performance, compared to existing multi-resolution crop yield models, whilst being able to provide estimation of the underlying spatial effects.
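A minimal sketch of why aggregated-output regression is tractable under a linear model: when the regional output is the average of cell-level responses, averaging the fine-resolution covariate within each region yields an unbiased design for the cell-level coefficient. All quantities below are hypothetical; the paper's variational, multi-resolution machinery is not reproduced.

```python
# Toy aggregated-output regression: outputs (e.g., crop yields) observed
# per region, a covariate observed on a finer grid of cells inside each
# region. Linearity lets us regress regional outputs on cell averages.
import numpy as np

rng = np.random.default_rng(1)
n_regions, cells_per_region = 50, 16
beta_true = 2.5

# Fine-resolution covariate: cells_per_region cells within each region.
x_fine = rng.normal(size=(n_regions, cells_per_region))
# Aggregated output: mean of cell-level responses plus observation noise.
y = beta_true * x_fine.mean(axis=1) + rng.normal(scale=0.1, size=n_regions)

# Average the covariate over each region and fit by least squares.
X = x_fine.mean(axis=1, keepdims=True)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated cell-level effect: {beta_hat[0]:.3f} (true {beta_true})")
```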
Motivated by the analysis of high-dimensional neuroimaging signals located over the cortical surface, we introduce a novel Principal Component Analysis technique that can handle functional data located over a two-dimensional manifold. For this purpose a regularization approach is adopted, introducing a smoothing penalty coherent with the geodesic distance over the manifold. The model introduced can be applied to any manifold topology, can naturally handle missing data and functional samples evaluated in different grids of points. We approach the discretization task by means of finite element analysis and propose an efficient iterative algorithm for its resolution. We compare the performance of the proposed algorithm with that of other approaches classically adopted in the literature. We finally apply the proposed method to resting state functional magnetic resonance imaging data from the Human Connectome Project, where the method shows substantial differential variations between brain regions that were not apparent with other approaches.
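A flat, one-dimensional analogue can illustrate the penalized-PCA mechanics: alternating least squares for a rank-one component with a roughness penalty on the component function. The paper works on a 2D manifold with a geodesic-coherent penalty discretized by finite elements; the second-difference penalty and all parameters below are stand-ins.

```python
# 1D analogue of regularized functional PCA: rank-one decomposition with a
# second-difference roughness penalty on the component, fit by alternating
# least squares. Illustrative only; the paper's FEM/manifold setting differs.
import numpy as np

rng = np.random.default_rng(2)
n, p = 100, 60
grid = np.linspace(0, 1, p)
true_f = np.sin(2 * np.pi * grid)                    # smooth underlying component
Y = rng.normal(size=(n, 1)) * true_f + 0.3 * rng.normal(size=(n, p))

D = np.diff(np.eye(p), n=2, axis=0)                  # second-difference operator
P = D.T @ D                                          # roughness penalty matrix
lam = 1.0

v = rng.normal(size=p)
for _ in range(50):                                  # alternating least squares
    u = Y @ v / (v @ v)                              # scores given the component
    v = np.linalg.solve((u @ u) * np.eye(p) + lam * P, Y.T @ u)  # penalized component
v /= np.linalg.norm(v)
print("correlation with true component:", abs(np.corrcoef(v, true_f)[0, 1]).round(3))
```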
Arctic sea ice plays an important role in the global climate. Sea ice models governed by physical equations have been used to simulate the state of the ice, including characteristics such as ice thickness, concentration, and motion. More recent models also attempt to capture features such as fractures or leads in the ice. These simulated features can be partially misaligned or misshapen when compared to observational data, whether due to numerical approximation or incomplete physics. In order to make realistic forecasts and improve understanding of the underlying processes, it is necessary to calibrate the numerical model to field data. Traditional calibration methods based on generalized least-squares metrics are flawed for linear features such as sea ice cracks. We develop a statistical emulation and calibration framework that accounts for feature misalignment and misshapenness, which involves optimally aligning model output with observed features using cutting-edge image registration techniques. This work may also apply to other physical models that produce coherent structures.
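A toy stand-in for the alignment step: estimate the translation that best registers a simulated linear feature (such as an ice crack) to the observed one before measuring any calibration discrepancy. The paper uses more sophisticated image registration; a brute-force shift search and synthetic images substitute for it here.

```python
# Toy feature alignment: find the integer shift that best overlays a
# simulated linear feature onto an observed one, then report the residual
# discrepancy. Stands in for the paper's image-registration techniques.
import numpy as np

def make_crack(shape, row, col0, col1):
    img = np.zeros(shape)
    img[row, col0:col1] = 1.0   # a horizontal linear feature
    return img

observed = make_crack((40, 40), row=20, col0=5, col1=30)
simulated = make_crack((40, 40), row=23, col0=8, col1=33)  # misaligned model output

best = (np.inf, (0, 0))
for dr in range(-5, 6):
    for dc in range(-5, 6):
        shifted = np.roll(np.roll(simulated, dr, axis=0), dc, axis=1)
        err = np.sum((shifted - observed) ** 2)   # discrepancy after this shift
        best = min(best, (err, (dr, dc)))

err, (dr, dc) = best
print(f"best shift: {dr} rows, {dc} cols; residual discrepancy {err:.1f}")
```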
This work is motivated by the Obepine French system for SARS-CoV-2 viral load monitoring in wastewater. The objective is to identify, from time series of noisy measurements, the underlying auto-regressive signals, in a context where the measurements present numerous missing data, censoring and outliers. We propose a method based on an auto-regressive model adapted to censored data with outliers. Inference and prediction are produced via a discretised smoother. The method is validated both on simulations and on real data from Obepine. The proposed method is used to denoise measurements from the quantification of the SARS-CoV-2 E gene in wastewater by RT-qPCR. The resulting smoothed signal shows a good correlation with other epidemiological indicators, and an estimate of the whole system noise is produced.
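A simplified Gaussian version of the smoothing idea: recover an AR(1) signal from noisy observations with missing values using a Kalman filter plus an RTS smoother. The paper's discretised smoother additionally handles censoring and outliers, which this sketch omits; all parameters are illustrative.

```python
# Minimal sketch: smooth an AR(1) signal observed with noise and 30%
# missing data. Kalman filter forward, Rauch-Tung-Striebel pass backward.
import numpy as np

rng = np.random.default_rng(3)
T, phi, q, r = 200, 0.95, 0.1, 0.5    # length, AR(1) coefficient, variances

x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=T)
y[rng.random(T) < 0.3] = np.nan       # 30% of measurements missing

# Kalman filter; the update step is skipped wherever the observation is missing.
m, P = np.zeros(T), np.zeros(T)
mp, Pp = np.zeros(T), np.zeros(T)
for t in range(T):
    mp[t] = phi * m[t - 1] if t > 0 else 0.0          # predicted mean
    Pp[t] = phi**2 * P[t - 1] + q if t > 0 else 1.0   # predicted variance
    if np.isnan(y[t]):
        m[t], P[t] = mp[t], Pp[t]
    else:
        k = Pp[t] / (Pp[t] + r)                       # Kalman gain
        m[t] = mp[t] + k * (y[t] - mp[t])
        P[t] = (1 - k) * Pp[t]

# Rauch-Tung-Striebel backward smoothing pass.
ms = m.copy()
for t in range(T - 2, -1, -1):
    g = P[t] * phi / Pp[t + 1]
    ms[t] = m[t] + g * (ms[t + 1] - mp[t + 1])

print("filtered RMSE :", np.sqrt(np.mean((m - x) ** 2)).round(3))
print("smoothed RMSE :", np.sqrt(np.mean((ms - x) ** 2)).round(3))
```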