
A Simulation-free Group Sequential Design with Max-combo Tests in the Presence of Non-proportional Hazards

Posted by: Lili Wang
Publication date: 2019
Research field: Mathematical Statistics
Paper language: English





Non-proportional hazards (NPH) have recently been observed in many immuno-oncology clinical trials. Weighted log-rank tests (WLRT) with suitably chosen weights can improve the power to detect a difference between two survival curves in the presence of NPH. However, it is not easy to choose a proper WLRT in practice when both robustness and efficiency are considered. The versatile max-combo test was proposed to balance robustness and efficiency and has received increasing attention in both methodology development and application. Survival trials, however, often warrant interim analyses because of their high cost and long duration, and integrating max-combo tests into interim analyses typically requires extensive simulation studies. In this paper, we propose a simulation-free approach to group sequential design with max-combo tests in survival trials. Simulation results show that the proposed approach controls the type I error rate and offers high accuracy and flexibility in estimating sample sizes, at the cost of only a light computational burden. Notably, our methods are robust to various model misspecifications and have been implemented in an R package that is freely available online.
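
To make the test concrete, the following Python sketch computes Fleming-Harrington weighted log-rank statistics for several (rho, gamma) weights and combines them into a one-sided max-combo statistic, with the p-value taken from the multivariate normal approximation to the joint null distribution. The function names, the default weight set, and the group coding (group == 1 for the experimental arm) are illustrative assumptions; this is not the interface of the paper's R package, and it does not include the sequential boundaries the paper derives for interim analyses.

    import numpy as np
    from scipy.stats import multivariate_normal

    def wlr_components(time, event, group, rho, gamma):
        # Per-event-time Fleming-Harrington weight w_i, observed-minus-expected
        # events in group 1 (o_i), and hypergeometric variance term v_i.
        time, event, group = map(np.asarray, (time, event, group))
        w, o, v = [], [], []
        surv = 1.0  # pooled Kaplan-Meier estimate, used left-continuously as S(t-)
        for t in np.unique(time[event == 1]):
            at_risk = time >= t
            n_tot = at_risk.sum()
            n1 = (at_risk & (group == 1)).sum()
            d_tot = ((time == t) & (event == 1)).sum()
            d1 = ((time == t) & (event == 1) & (group == 1)).sum()
            w.append(surv ** rho * (1.0 - surv) ** gamma)
            o.append(d1 - d_tot * n1 / n_tot)
            v.append(d_tot * (n1 / n_tot) * (1.0 - n1 / n_tot)
                     * (n_tot - d_tot) / max(n_tot - 1, 1))
            surv *= 1.0 - d_tot / n_tot  # update pooled KM after using S(t-)
        return np.array(w), np.array(o), np.array(v)

    def maxcombo_test(time, event, group,
                      weights=((0, 0), (0, 1), (1, 0), (1, 1))):
        # One-sided max-combo test over a set of G(rho, gamma) weights;
        # positive Z favouring group 1 is an illustrative sign convention.
        comps = [wlr_components(time, event, group, r, g) for r, g in weights]
        k = len(comps)
        u = np.array([(w * o).sum() for w, o, _ in comps])
        # The weighted statistics share the per-time variance terms, so their
        # covariance matrix is available in closed form (no simulation needed).
        cov = np.array([[(comps[a][0] * comps[b][0] * comps[a][2]).sum()
                         for b in range(k)] for a in range(k)])
        z = u / np.sqrt(np.diag(cov))
        corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
        z_max = z.max()
        # P(max_k Z_k > z_max) under H0 via the multivariate normal approximation.
        p = 1.0 - multivariate_normal(mean=np.zeros(k), cov=corr).cdf(np.full(k, z_max))
        return z, z_max, p

Given right-censored data (event times, event indicators, group indicators), this returns the per-weight Z statistics, the max-combo statistic, and an approximate one-sided p-value; a group sequential design would additionally compare the statistics at each interim look against boundaries derived from their joint distribution across looks, which is the part the paper handles without simulation.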


Read also

Behzad Kianian (2019)
Instrumental variables (IV) are a useful tool for estimating causal effects in the presence of unmeasured confounding. IV methods are well developed for uncensored outcomes, particularly for structural linear equation models, where simple two-stage estimation schemes are available. The extension of these methods to survival settings is challenging, partly because of the nonlinearity of the popular survival regression models and partly because of the complications associated with right censoring or other survival features. We develop a simple causal hazard ratio estimator in a proportional hazards model with right censored data. The method exploits a special characterization of IV which enables the use of an intuitive inverse weighting scheme that is generally applicable to more complex survival settings with left truncation, competing risks, or recurrent events. We rigorously establish the asymptotic properties of the estimators, and provide plug-in variance estimators. The proposed method can be implemented in standard software, and is evaluated through extensive simulation studies. We apply the proposed IV method to a data set from the Prostate, Lung, Colorectal and Ovarian cancer screening trial to delineate the causal effect of flexible sigmoidoscopy screening on colorectal cancer survival which may be confounded by informative noncompliance with the assigned screening regimen.
In this manuscript we demonstrate the analysis of right-censored survival outcomes using the MRH package in R. The MRH package implements the multi-resolution hazard (MRH) model, which is a Polya-tree based, Bayesian semi-parametric method for flexible estimation of the hazard rate and covariate effects. The package allows for covariates to be included under the proportional and non-proportional hazards assumption, and for robust estimation of the hazard rate in periods of sparsely observed failures via a pruning tool.
Simulation-based optimal design techniques are a convenient tool for solving a particular class of optimal design problems. The goal is to find the optimal configuration of factor settings with respect to an expected utility criterion. This criterion depends on the specified probability model for the data and on the assumed prior distribution for the model parameters. We develop new simulation-based optimal design methods which incorporate likelihood-free approaches and utilize them in novel applications. Most simulation-based design strategies solve the intractable expected utility integral at a specific design point by using Monte Carlo simulations from the probability model. Optimizing the criterion over the design points is carried out in a separate step. Muller (1999) introduces an MCMC algorithm which simultaneously addresses the simulation as well as the optimization problem. In principle, the optimal design can be found by detecting the utility mode of the sampled design points. Several improvements have been suggested to facilitate this task for multidimensional design problems (see e.g. Amzal et al. 2006). We aim to extend this simulation-based design methodology to design problems where the likelihood of the probability model is of an unknown analytical form but it is possible to simulate from the probability model. We further assume that prior observations are available. In such a setting it seems natural to employ approximate Bayesian computation (ABC) techniques in order to be able to simulate from the conditional probability model. We provide a thorough review of adjacent literature and we investigate the benefits and the limitations of our design methodology for a particular paradigmatic example.
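
For reference, the two-step strategy described above (Monte Carlo estimation of the expected utility at each design point, followed by a separate optimization over candidate designs) can be sketched in a few lines of Python. The prior, probability model, and utility below are toy placeholders introduced only for illustration; they are not the models studied in the cited work.

    import numpy as np

    rng = np.random.default_rng(0)

    def prior_draw(n):
        # Toy prior over the model parameter theta (assumption for illustration).
        return rng.normal(loc=1.0, scale=0.5, size=n)

    def simulate(theta, design, n_obs=20):
        # Toy probability model: observations depend on theta and the design point.
        return rng.normal(loc=theta[:, None] * design, scale=1.0,
                          size=(theta.size, n_obs))

    def utility(y, theta, design):
        # Toy utility: negative squared error of a simple plug-in estimate of theta.
        theta_hat = y.mean(axis=1) / max(design, 1e-8)
        return -(theta_hat - theta) ** 2

    def expected_utility(design, n_sim=5000):
        # Monte Carlo estimate of U(d) = E_theta E_{y|theta,d}[ u(d, theta, y) ].
        theta = prior_draw(n_sim)
        y = simulate(theta, design)
        return utility(y, theta, design).mean()

    designs = np.linspace(0.1, 2.0, 20)            # candidate design points
    scores = np.array([expected_utility(d) for d in designs])
    best = designs[scores.argmax()]                # separate optimization step
    print(f"estimated optimal design: {best:.2f}")

Muller (1999)-style algorithms fold this outer search into the sampler itself, while the likelihood-free extension discussed above would replace direct evaluation of the likelihood with draws from the simulator combined with an ABC acceptance step.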
Smooth backfitting has proven to have a number of theoretical and practical advantages in structured regression. Smooth backfitting projects the data down onto the structured space of interest, providing a direct link between data and estimator. This paper introduces the ideas of smooth backfitting to survival analysis in a proportional hazards model, where we assume an underlying conditional hazard with multiplicative components. We develop asymptotic theory for the estimator and we use the smooth backfitter in a practical application, where we extend recent advances in in-sample forecasting methodology by allowing more information to be incorporated, while still obeying the structured requirements of in-sample forecasting.
Bayesian inference without access to the likelihood, or likelihood-free inference, has been a key research topic in simulation, aiming to yield more realistic generated results. Recent likelihood-free inference updates an approximate posterior sequentially with a dataset of cumulative simulation input-output pairs gathered over inference rounds. The dataset is therefore assembled through iterative simulations with inputs sampled from a proposal distribution by MCMC, which becomes the key to inference quality in this sequential framework. This paper introduces a new proposal model, named the Implicit Surrogate Proposal (ISP), to generate the cumulative dataset with greater sample efficiency. ISP constructs the cumulative dataset in a highly diverse way by drawing i.i.d. samples in a feed-forward fashion, so the posterior inference does not suffer from the disadvantages of MCMC caused by its non-i.i.d. nature, such as auto-correlation and slow mixing. We analyze the convergence properties of ISP in both theoretical and empirical aspects to guarantee that ISP provides an asymptotically exact sampler. We demonstrate that ISP outperforms baseline inference algorithms on simulations with multi-modal posteriors.