
A Robust Statistical Method to Estimate the Intervention Effect with Longitudinal Data

Published by: Mohammad Islam
Publication date: 2020
Research field: Mathematical Statistics
Paper language: English





Segmented regression is a standard statistical procedure used to estimate the effect of a policy intervention on time-series outcomes. The method assumes normality of the outcome variable, a large sample size, no autocorrelation in the observations, and a linear trend over time; it is also very sensitive to outliers. In a small-sample study, if the outcome variable does not follow a Gaussian distribution, using segmented regression to estimate the intervention effect leads to incorrect inferences. To address the small-sample problem and non-normality in the outcome variable, including outliers, we describe and develop a robust statistical method to estimate the policy intervention effect in a series of longitudinal data. A simulation study is conducted to demonstrate the effect of outliers and non-normality in the outcomes by calculating the power of the test statistics under both segmented regression and the proposed robust method. Moreover, since the sampling distribution of the proposed robust statistic is analytically difficult to derive, we use a nonparametric bootstrap technique to study its properties and make statistical inferences. Simulation studies show that the proposed method has more power than the standard t-test used in segmented regression analysis under non-normal error distributions. Finally, we use the developed technique to estimate the intervention effect of the Istanbul Declaration on illegal organ activities. The robust method detected more significant effects and provided shorter confidence intervals than the standard method.
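
As a minimal sketch of the standard approach the abstract contrasts against, the code below fits a segmented (interrupted time series) regression by ordinary least squares and uses a nonparametric residual bootstrap to approximate the sampling distribution of the level-change coefficient. The model form (a level and slope change at a known intervention time), the heavy-tailed simulated errors, and all variable names are illustrative assumptions; the paper's robust statistic itself is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated monthly outcome with an intervention at t0; heavy-tailed
# (Student-t) errors mimic the non-normal setting the paper targets.
# All values here are illustrative assumptions, not the paper's design.
n, t0 = 48, 24
t = np.arange(n)
post = (t >= t0).astype(float)      # level-change indicator
t_since = post * (t - t0)           # slope-change term
y = 10 + 0.1 * t - 2.0 * post + 0.05 * t_since + rng.standard_t(3, size=n)

# Segmented regression: y = b0 + b1*t + b2*post + b3*t_since + error.
X = sm.add_constant(np.column_stack([t, post, t_since]))
fit = sm.OLS(y, X).fit()
print("estimated level change:", fit.params[2])

# Nonparametric residual bootstrap of the level-change coefficient
# (resampling residuals is valid under the no-autocorrelation assumption
# listed in the abstract).
boot = []
for _ in range(2000):
    y_star = fit.fittedvalues + rng.choice(fit.resid, size=n, replace=True)
    boot.append(sm.OLS(y_star, X).fit().params[2])
print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))
```

The percentile interval read off the bootstrap replicates is the same device the paper applies to its robust statistic when the sampling distribution is analytically intractable.
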




Read also

The common fisheries policy aims at eliminating discarding, which has been part of fisheries for centuries. It is important to monitor compliance with the new regulations, but estimating the discard rate is a challenging task, especially where the practice is illegal. The aim of this study was to review a length-based method that has been used to estimate the discard rate in Icelandic waters and to explore the effects of different monitoring schemes. The length-based method estimates the minimum discard rate, and bootstrapping can be used to determine the uncertainty of the estimate. This study showed that the number of ships is the most important factor to consider in order to decrease the uncertainty.
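
The bootstrap step of that study can be illustrated with a short sketch. The length-based minimum-discard estimator itself is not reproduced, so the per-ship rates below are hypothetical placeholders; ships are resampled as the sampling unit, in line with the finding that the number of ships drives the uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-ship minimum discard-rate estimates; the length-based
# estimator that would produce these values is not reproduced here.
ship_rates = rng.beta(2, 30, size=25)

def bootstrap_ci(x, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for the mean, resampling ships as the unit."""
    means = [rng.choice(x, size=len(x), replace=True).mean()
             for _ in range(n_boot)]
    return np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])

print("estimated minimum discard rate:", ship_rates.mean())
print("95% bootstrap CI:", bootstrap_ci(ship_rates))
```
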
The determination of the infection fatality rate (IFR) for the novel SARS-CoV-2 coronavirus is a key aim for many of the field studies currently being undertaken in response to the pandemic. The IFR and the basic reproduction number $R_0$ are the main epidemic parameters describing the severity and transmissibility of the virus, respectively. The IFR can also be used as a basis for estimating and monitoring the number of infected individuals in a population, which may subsequently be used to inform policy decisions relating to public health interventions and lockdown strategies. The interpretation of IFR measurements requires the calculation of confidence intervals. We present a number of statistical methods that are relevant in this context and develop an inverse problem formulation to determine correction factors that mitigate time-dependent effects which can lead to biased IFR estimates. We also review a number of methods to combine IFR estimates from multiple independent studies, provide example calculations throughout this note, and conclude with a summary and best-practice recommendations. The developed code is available online.
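
The note reviews several interval methods; as one standard example (it is an assumption that this particular method is among those covered), the sketch below computes an exact Clopper-Pearson binomial interval for an IFR estimated as deaths over infections. It ignores the time-dependent correction factors the note develops, and the counts are illustrative.

```python
from scipy.stats import beta

def clopper_pearson(deaths, infections, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval for the IFR,
    treating deaths as binomial successes among the infected."""
    lo = 0.0 if deaths == 0 else beta.ppf(alpha / 2, deaths, infections - deaths + 1)
    hi = 1.0 if deaths == infections else beta.ppf(1 - alpha / 2, deaths + 1, infections - deaths)
    return lo, hi

# Hypothetical field-study counts, for illustration only.
deaths, infections = 7, 1200
print("IFR estimate:", deaths / infections)
print("95% CI:", clopper_pearson(deaths, infections))
```
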
This work is motivated by the Obepine French system for SARS-CoV-2 viral load monitoring in wastewater. The objective is to identify, from time series of noisy measurements, the underlying auto-regressive signals in a context where the measurements contain numerous missing values, censoring, and outliers. We propose a method based on an auto-regressive model adapted to censored data with outliers. Inference and prediction are produced via a discretised smoother. The method is validated both on simulations and on real data from Obepine, and is used to denoise measurements from the quantification of the SARS-CoV-2 E gene in wastewater by RT-qPCR. The resulting smoothed signal shows a good correlation with other epidemiological indicators, and an estimate of the whole-system noise is produced.
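
As a rough starting point for the smoothing step (not the paper's full model, which additionally handles censoring and outliers), the sketch below runs a Kalman filter with a Rauch-Tung-Striebel backward pass for a latent AR(1) signal, treating NaN measurements as missing. The parameter values are illustrative assumptions.

```python
import numpy as np

def ar1_smoother(y, phi=0.9, q=1.0, r=0.5):
    """Kalman filter + RTS smoother for x_t = phi*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). NaN observations are treated as missing.
    The censoring and outlier handling of the paper are not included."""
    n = len(y)
    m_f, p_f = np.zeros(n), np.zeros(n)        # filtered mean / variance
    m, p = 0.0, q / (1 - phi ** 2)             # stationary prior
    for t in range(n):
        if np.isnan(y[t]):                     # missing: keep prediction
            m_f[t], p_f[t] = m, p
        else:
            k = p / (p + r)                    # Kalman gain
            m_f[t], p_f[t] = m + k * (y[t] - m), (1 - k) * p
        m, p = phi * m_f[t], phi ** 2 * p_f[t] + q
    m_s, p_s = m_f.copy(), p_f.copy()          # backward (RTS) pass
    for t in range(n - 2, -1, -1):
        p_pred = phi ** 2 * p_f[t] + q
        g = p_f[t] * phi / p_pred
        m_s[t] = m_f[t] + g * (m_s[t + 1] - phi * m_f[t])
        p_s[t] = p_f[t] + g ** 2 * (p_s[t + 1] - p_pred)
    return m_s, p_s

y = np.array([1.2, np.nan, 0.8, 2.5, np.nan, 1.1])
print(ar1_smoother(y)[0])                      # smoothed signal estimate
```
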
The Coronavirus Disease 2019 (COVID-19) pandemic has caused a tremendous number of deaths and has had a devastating impact on economic development all over the world. It is therefore paramount to control its further transmission, for which purpose it is necessary to understand its transmission process and evaluate the effect of different control strategies. To deal with these issues, we describe the transmission of COVID-19 as an explosive Markov process with four parameters. The state transitions of the proposed Markov process clearly disclose the terrible explosion and complex heterogeneity of COVID-19. Based on this, we further propose a simulation approach with heterogeneous infections. Experiments show that our approach can closely track the real transmission process of COVID-19, disclose its transmission mechanism, and forecast transmission under different non-drug intervention strategies. More importantly, our approach can help develop effective strategies for controlling COVID-19 and compare their control effects across different countries and cities.
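
The abstract does not specify the four-parameter explosive Markov process, so the sketch below substitutes a generic branching process with negative-binomial offspring, a common way to model heterogeneous (superspreading-driven) transmission; the values of $R_0$ and the dispersion k are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_outbreak(r0=2.5, k=0.1, n_gen=8, seed_cases=5):
    """Branching-process sketch of heterogeneous transmission: each case
    generates a NegBin number of secondary cases with mean r0 and
    dispersion k, so small k yields superspreading-like explosions.
    This stands in for, and is not, the paper's four-parameter process."""
    cases = [seed_cases]
    for _ in range(n_gen):
        p = k / (k + r0)                 # NegBin(n=k, p) has mean r0
        new = rng.negative_binomial(k, p, size=cases[-1]).sum()
        cases.append(int(new))
    return cases

print(simulate_outbreak())               # cases per generation
```
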
We present a method to build a probability density function (pdf) for the age of a star based on its peculiar velocities $U$, $V$, and $W$ and its orbital eccentricity. The sample used in this work comes from the Geneva-Copenhagen Survey (GCS), which contains the spatial velocities, orbital eccentricities, and isochronal ages for about $14,000$ stars. Using the GCS stars, we fitted the parameters that describe the relations between the distributions of kinematical properties and age. This parametrization allows us to obtain an age probability from the kinematical data. From this age pdf, we estimate an individual average age for the star using the most likely age and the expected age. We obtained the age pdf for $9,102$ stars from the GCS and have shown that the distribution of individual ages derived from our method is in good agreement with the distribution of isochronal ages. We also observe a decline in the mean metallicity with our ages for stars younger than 7 Gyr, similar to the one observed for isochronal ages. This method can be useful for estimating rough stellar ages for stars that fall in areas of the HR diagram where isochrones are tightly crowded. As an example, we estimate the age of Trappist-1, an M8V star, obtaining $t(UVW) = 12.50(+0.29-6.23)$ Gyr.
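
The fitted GCS relations are not given in the abstract, so the sketch below only illustrates the general idea: assume each velocity component is Gaussian with a dispersion that grows with age as sigma(t) = sigma1 * t**beta (a generic age-velocity-dispersion relation), multiply the component likelihoods on an age grid, and read off the most likely and expected ages. The coefficients are placeholders, not the paper's fits, and the orbital-eccentricity term is omitted.

```python
import numpy as np

def age_pdf(U, V, W, ages=None):
    """p(age | U, V, W) under an assumed Gaussian velocity model whose
    dispersions grow with age as sigma(t) = sigma1 * t**beta.
    Coefficients are illustrative placeholders, not the GCS fits,
    and the eccentricity term used in the paper is omitted."""
    if ages is None:
        ages = np.linspace(0.1, 13.5, 500)          # Gyr grid
    disp = {"U": (22.0, 0.33), "V": (14.0, 0.33), "W": (10.0, 0.45)}
    logp = np.zeros_like(ages)
    for v, (s1, beta) in zip((U, V, W), disp.values()):
        sigma = s1 * ages ** beta                   # km/s
        logp += -0.5 * (v / sigma) ** 2 - np.log(sigma)
    pdf = np.exp(logp - logp.max())
    dx = ages[1] - ages[0]
    return ages, pdf / (pdf.sum() * dx)             # normalise on the grid

ages, pdf = age_pdf(U=-30.0, V=-15.0, W=8.0)        # km/s, illustrative star
print("most likely age:", ages[np.argmax(pdf)], "Gyr")
print("expected age:", (ages * pdf).sum() * (ages[1] - ages[0]), "Gyr")
```
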