
Analytic expressions for the Cumulative Distribution Function of the Composed Error Term in Stochastic Frontier Analysis with Truncated Normal and Exponential Inefficiencies

Published by Rouven Schmidt
Publication date: 2020
Research field: Mathematical Statistics
Paper language: English





In the stochastic frontier model, the composed error term consists of the measurement error and the inefficiency term. A common assumption is that the inefficiency term follows a truncated normal or an exponential distribution. In a wide variety of models, evaluating the cumulative distribution function of the composed error term is required. This work introduces and proves four representation theorems for these distributions, two for each distributional assumption. These representations can be utilized for fast and accurate evaluation.
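As a minimal illustration of the quantity in question, the sketch below evaluates the CDF of the composed error for the normal-exponential case by numerically integrating the standard composed-error density and cross-checking against Monte Carlo simulation. This is not the paper's analytic representation; the parameterization (v normal with standard deviation sigma_v, inefficiency u exponential with mean sigma_u) and all numerical values are illustrative assumptions.

```python
# Sketch: CDF of the composed error eps = v - u in the normal-exponential
# stochastic frontier model, via numerical integration of the standard
# composed-error density, cross-checked by Monte Carlo.  NOT the paper's
# analytic representation; sigma_v, sigma_u are illustrative assumptions.
import numpy as np
from scipy import integrate
from scipy.stats import norm

def composed_density(eps, sigma_v, sigma_u):
    """Density of eps = v - u, with v ~ N(0, sigma_v^2), u ~ Exp(mean sigma_u)."""
    return (1.0 / sigma_u
            * np.exp(eps / sigma_u + sigma_v**2 / (2.0 * sigma_u**2))
            * norm.cdf(-eps / sigma_v - sigma_v / sigma_u))

def composed_cdf(z, sigma_v, sigma_u):
    """CDF at z obtained by integrating the density numerically."""
    val, _ = integrate.quad(composed_density, -np.inf, z,
                            args=(sigma_v, sigma_u))
    return val

if __name__ == "__main__":
    sigma_v, sigma_u = 0.3, 0.5
    rng = np.random.default_rng(0)
    eps = rng.normal(0, sigma_v, 200_000) - rng.exponential(sigma_u, 200_000)
    for z in (-1.0, -0.25, 0.0, 0.5):
        print(z, composed_cdf(z, sigma_v, sigma_u), np.mean(eps <= z))
```

Repeated numerical integration of this kind is exactly the cost that closed-form representations of the CDF are meant to avoid.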




Read also

This paper revisits the problem of computing empirical cumulative distribution functions (ECDF) efficiently on large, multivariate datasets. Computing an ECDF at one evaluation point requires $\mathcal{O}(N)$ operations on a dataset composed of $N$ data points. Therefore, a direct evaluation of ECDFs at $N$ evaluation points requires quadratic $\mathcal{O}(N^2)$ operations, which is prohibitive for large-scale problems. Two fast and exact methods are proposed and compared. The first one is based on fast summation in lexicographical order, with an $\mathcal{O}(N \log N)$ complexity, and requires the evaluation points to lie on a regular grid. The second one is based on the divide-and-conquer principle, with an $\mathcal{O}(N \log(N)^{(d-1) \vee 1})$ complexity, and requires the evaluation points to coincide with the input points. The two fast algorithms are described and detailed in the general $d$-dimensional case, and numerical experiments validate their speed and accuracy. Second, the paper establishes a direct connection between cumulative distribution functions and kernel density estimation (KDE) for a large class of kernels. This connection paves the way for fast exact algorithms for multivariate kernel density estimation and kernel regression. Numerical tests with the Laplacian kernel validate the speed and accuracy of the proposed algorithms. A broad range of large-scale multivariate density estimation, cumulative distribution estimation, survival function estimation and regression problems can benefit from the proposed numerical methods.
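A minimal sketch of the complexity gap in the univariate case only: direct $\mathcal{O}(N^2)$ evaluation of the ECDF at all data points versus a sort-and-binary-search evaluation in $\mathcal{O}(N \log N)$. The paper's multivariate lexicographic-grid and divide-and-conquer algorithms are not reproduced here.

```python
# Sketch (univariate case only): naive O(N^2) ECDF evaluation at the data
# points versus a sort-based O(N log N) evaluation.
import numpy as np

def ecdf_direct(x):
    """ECDF of x evaluated at every point of x: O(N^2) comparisons."""
    x = np.asarray(x)
    return np.array([np.mean(x <= xi) for xi in x])

def ecdf_sorted(x):
    """Same values via sorting plus binary search: O(N log N)."""
    x = np.asarray(x)
    sorted_x = np.sort(x)
    # number of sample points <= x[i], found by binary search in the sorted copy
    counts = np.searchsorted(sorted_x, x, side="right")
    return counts / len(x)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=2000)
    assert np.allclose(ecdf_direct(x), ecdf_sorted(x))
    print("direct and sort-based ECDFs agree")
```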
Handling missing values plays an important role in the analysis of survival data, especially data marked by a cure fraction. In this paper, we discuss the properties and implementation of stochastic approximations to the expectation-maximization (EM) algorithm to obtain maximum likelihood (ML) type estimates in situations where missing data arise naturally due to right censoring and a proportion of individuals are immune to the event of interest. A flexible family of three-parameter exponentiated-Weibull (EW) distributions is assumed to characterize lifetimes of the non-immune individuals, as it accommodates both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub) hazard functions. To evaluate the performance of the stochastic EM (SEM) algorithm, an extensive simulation study is carried out under various parameter settings. Using the likelihood ratio test, we also carry out model discrimination within the EW family of distributions. Furthermore, we study the robustness of the SEM algorithm with respect to outliers and algorithm starting values. A few scenarios where the SEM algorithm outperforms the well-studied EM algorithm are also examined in the given context. For further demonstration, real survival data on cutaneous melanoma are analyzed using the proposed cure rate model with the EW lifetime distribution and the proposed estimation technique. Through these data, we illustrate the applicability of the likelihood ratio test towards rejecting several well-known lifetime distributions that are nested within the wider class of EW distributions.
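To make the SEM idea concrete, here is a minimal sketch of a stochastic E-step/M-step cycle for a mixture cure model with right censoring, deliberately simplified to exponential lifetimes for the susceptible individuals rather than the exponentiated-Weibull family studied in the paper. All parameter values, starting values and burn-in choices are illustrative.

```python
# Sketch of a stochastic EM (SEM) cycle for a mixture cure model, assuming
# exponential lifetimes for susceptible individuals (a simplification of
# the exponentiated-Weibull family used in the paper).
import numpy as np

rng = np.random.default_rng(2)

# --- simulate right-censored data with a cure fraction ------------------
n, pi_true, rate_true = 2000, 0.3, 1.0           # cure prob., event rate
cured = rng.random(n) < pi_true
event_time = rng.exponential(1.0 / rate_true, n)
event_time[cured] = np.inf                       # cured: event never occurs
censor_time = rng.exponential(2.0, n)
t = np.minimum(event_time, censor_time)
delta = (event_time <= censor_time).astype(int)  # 1 = event observed

# --- SEM iterations ------------------------------------------------------
pi_hat, rate_hat = 0.5, 0.5                      # crude starting values
draws = []
for it in range(300):
    # Stochastic E-step: impute latent susceptibility indicators.
    surv = np.exp(-rate_hat * t)                 # exponential survival at t
    p_susc = np.where(delta == 1, 1.0,
                      (1 - pi_hat) * surv / (pi_hat + (1 - pi_hat) * surv))
    b = rng.random(n) < p_susc                   # True = susceptible
    # M-step: closed-form ML updates given the imputed indicators.
    pi_hat = 1.0 - b.mean()
    rate_hat = delta.sum() / t[b].sum()
    if it >= 200:                                # discard burn-in iterations
        draws.append((pi_hat, rate_hat))

pi_sem, rate_sem = np.mean(draws, axis=0)
print(f"estimated cure fraction {pi_sem:.3f} (true {pi_true})")
print(f"estimated event rate    {rate_sem:.3f} (true {rate_true})")
```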
Statistical modeling of animal movement is of critical importance. The continuous trajectory of an animal's movements is only observed at discrete, often irregularly spaced time points. Most existing models cannot handle unequal sampling intervals naturally and/or do not allow inactivity periods such as resting or sleeping. The recently proposed moving-resting (MR) model is a Brownian motion governed by a telegraph process, which allows periods of inactivity in one state of the telegraph process. The MR model shows promise in modeling the movements of predators with long inactive periods, such as many felids, but its lack of accommodation of measurement errors seriously prohibits its application in practice. Here we incorporate measurement errors in the MR model and derive basic properties of the model. Inferences are based on a composite likelihood using the Markov property of the chain composed of every other observed increment. The performance of the method is validated in finite-sample simulation studies. Application to the movement data of a mountain lion in Wyoming illustrates the utility of the method.
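A minimal sketch of the data-generating mechanism being described: a one-dimensional moving-resting trajectory (Brownian motion switched on and off by a two-state telegraph process), observed at irregular times with additive Gaussian measurement error. The composite-likelihood inference of the paper is not implemented; all rate, volatility and noise values are illustrative.

```python
# Sketch: simulate a 1-D moving-resting trajectory with measurement error.
import numpy as np

rng = np.random.default_rng(3)
lam_move, lam_rest = 1.0, 0.5     # exponential switching rates (illustrative)
sigma_bm, sigma_err = 1.0, 0.1    # Brownian volatility, measurement sd
horizon = 50.0

# Telegraph process: alternating moving/resting segments with exponential lengths.
times, states = [0.0], [1]        # start in the moving state (1)
while times[-1] < horizon:
    rate = lam_move if states[-1] == 1 else lam_rest
    times.append(times[-1] + rng.exponential(1.0 / rate))
    states.append(1 - states[-1])

def position(t_grid):
    """True position at each (sorted) time: Brownian increments only while moving."""
    pos, x, prev_t, seg = [], 0.0, 0.0, 0
    for t in t_grid:
        # advance through all telegraph segments that end before t
        while seg + 1 < len(times) and times[seg + 1] <= t:
            dt = times[seg + 1] - prev_t
            if states[seg] == 1:
                x += rng.normal(0.0, sigma_bm * np.sqrt(dt))
            prev_t = times[seg + 1]
            seg += 1
        dt = t - prev_t
        if states[seg] == 1 and dt > 0:
            x += rng.normal(0.0, sigma_bm * np.sqrt(dt))
        prev_t = t
        pos.append(x)
    return np.array(pos)

# Irregular observation times, plus additive Gaussian measurement error.
obs_t = np.sort(rng.uniform(0.0, horizon, 200))
obs_x = position(obs_t) + rng.normal(0.0, sigma_err, 200)
print(obs_t[:5], obs_x[:5])
```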
Giles Hooker, Hanlin Shang (2020)
This paper presents tests to formally choose between regression models using different derivatives of a functional covariate in scalar-on-function regression. We demonstrate that for linear regression, models using different derivatives can be nested within a model that includes point-impact effects at the end-points of the observed functions. Contrasts can then be employed to test the specification of different derivatives. When nonlinear regression models are defined, we apply a $J$ test to determine the statistical significance of the nonlinear structure between a functional covariate and a scalar response. The finite-sample performance of these methods is verified in simulation, and their practical application is demonstrated using a chemometric data set.
The characteristic function of the folded normal distribution and its moment function are derived. The entropy of the folded normal distribution and the Kullback--Leibler divergences from the normal and half-normal distributions are approximated using Taylor series. The accuracy of the results is also assessed using different criteria. The maximum likelihood estimates and confidence intervals for the parameters are obtained using asymptotic theory and the bootstrap method. The coverage of the confidence intervals is also examined.
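As a minimal numerical counterpart to the quantities mentioned here, the sketch below evaluates the folded normal density and computes its differential entropy and the Kullback-Leibler divergence from the half-normal by numerical integration, which is the kind of benchmark a Taylor-series approximation would be checked against. The values of mu and sigma are illustrative.

```python
# Sketch: folded normal density, with entropy and KL divergence from the
# half-normal computed by numerical integration (illustrative mu, sigma).
import numpy as np
from scipy import integrate
from scipy.stats import foldnorm, halfnorm

mu, sigma = 1.2, 0.8

def f_folded(x):
    """Folded normal density: X = |Y| with Y ~ N(mu, sigma^2), x >= 0."""
    c = 1.0 / (sigma * np.sqrt(2.0 * np.pi))
    return c * (np.exp(-(x - mu) ** 2 / (2 * sigma**2))
                + np.exp(-(x + mu) ** 2 / (2 * sigma**2)))

def g_half(x):
    """Half-normal density with the same scale (folded normal with mu = 0)."""
    return np.sqrt(2.0 / np.pi) / sigma * np.exp(-x**2 / (2 * sigma**2))

upper = mu + 30 * sigma   # tail beyond this point is numerically negligible
entropy, _ = integrate.quad(lambda x: -f_folded(x) * np.log(f_folded(x)), 0, upper)
kl, _ = integrate.quad(lambda x: f_folded(x) * np.log(f_folded(x) / g_half(x)), 0, upper)
print("entropy of folded normal :", entropy)
print("KL(folded || half-normal):", kl)

# Sanity check of the hand-written density against scipy's parameterization.
xs = np.linspace(0.01, 5, 7)
assert np.allclose(f_folded(xs), foldnorm.pdf(xs, c=mu / sigma, scale=sigma))
assert np.allclose(g_half(xs), halfnorm.pdf(xs, scale=sigma))
```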