
Convex ordering for random vectors using predictable representation

Posted by Marc Arnaudon
Publication date: 2008
Paper language: English





We prove convex ordering results for random vectors admitting a predictable representation in terms of a Brownian motion and a not necessarily independent jump component. Our method uses forward-backward stochastic calculus and extends previous results in the one-dimensional case. We also study a geometric interpretation of convex ordering for discrete measures in connection with the conditions set on the jump heights and intensities of the considered processes.
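For context, the convex order referred to in the abstract is the standard one; the following definition is supplied here for the reader's convenience and is not quoted from the paper. For $\mathbb{R}^d$-valued random vectors $X$ and $Y$,

$$X \preceq_{cx} Y \quad \Longleftrightarrow \quad \mathbb{E}[\phi(X)] \le \mathbb{E}[\phi(Y)] \quad \text{for every convex } \phi : \mathbb{R}^d \to \mathbb{R} \text{ such that both expectations are finite.}$$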




Read also

Xinjia Chen (2013)
We derive simple concentration inequalities for bounded random vectors, which generalize Hoeffding's inequalities for bounded scalar random variables. As applications, we apply the general results to multinomial and Dirichlet distributions to obtain multivariate concentration inequalities.
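For reference, the scalar bound being generalized is the classical Hoeffding inequality (a standard statement, not quoted from the abstract): for independent random variables $X_1,\dots,X_n$ with $a_i \le X_i \le b_i$ and any $t > 0$,

$$\Pr\Big(\sum_{i=1}^n \big(X_i - \mathbb{E}[X_i]\big) \ge t\Big) \le \exp\!\left(-\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$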
A definition of $d$-dimensional $n$-Meixner random vectors is given first. This definition involves the commutators of their semi-quantum operators. After that we will focus on the $1$-Meixner random vectors, and derive a system of $d$ partial differential equations satisfied by their Laplace transform. We provide a set of necessary conditions for this system to be integrable. We use these conditions to give a complete characterization of all non-degenerate three-dimensional $1$-Meixner random vectors. It must be mentioned that the three-dimensional case produces the first example in which the components of a $1$-Meixner random vector cannot be reduced, via an injective linear transformation, to three independent classic Meixner random variables.
Adaptive Monte Carlo methods are very efficient techniques designed to tune simulation estimators on-line. In this work, we present an alternative to stochastic approximation to tune the optimal change of measure in the context of importance sampling for normal random vectors. Unlike stochastic approximation, which requires very fine tuning in practice, we propose to use sample average approximation and deterministic optimization techniques to devise a robust and fully automatic variance reduction methodology. The same samples are used in the sample optimization of the importance sampling parameter and in the Monte Carlo computation of the expectation of interest with the optimal measure computed in the previous step. We prove that this highly dependent Monte Carlo estimator is convergent and satisfies a central limit theorem with the optimal limiting variance. Numerical experiments confirm the performance of this estimator: in comparison with the crude Monte Carlo method, the computation time needed to achieve a given precision is divided by a factor between 3 and 15.
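The following is a minimal sketch of the sample average approximation idea described above, under illustrative assumptions (a mean-shift change of measure for $X \sim N(0, I_d)$, a placeholder payoff `f`, and generic `scipy` optimization); it is not the authors' implementation.

```python
# Sketch: tune a mean-shift importance sampling estimator for X ~ N(0, I_d)
# by sample average approximation, then reuse the same samples for the estimate.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, n = 2, 100_000
X = rng.standard_normal((n, d))       # samples from the original measure N(0, I_d)

def f(x):
    # Illustrative payoff (an assumption, not from the abstract): a rare-event indicator.
    return (x.sum(axis=1) > 4.0).astype(float)

fx2 = f(X) ** 2

def second_moment(theta):
    # Empirical second moment of the theta-shifted estimator (convex in theta):
    # (1/n) * sum_i f(X_i)^2 * exp(-theta.X_i + |theta|^2 / 2)
    return np.mean(fx2 * np.exp(-X @ theta + 0.5 * theta @ theta))

theta_star = minimize(second_moment, np.zeros(d), method="BFGS").x

# Reuse the SAME samples for the final importance sampling estimate:
# (1/n) * sum_i f(X_i + theta*) * exp(-theta*.X_i - |theta*|^2 / 2)
weights = np.exp(-X @ theta_star - 0.5 * theta_star @ theta_star)
estimate = np.mean(f(X + theta_star) * weights)
print("theta* =", theta_star, " estimate =", estimate)
```

Reusing the samples `X` in both the optimization and the final average is the dependence the abstract refers to; the empirical second moment above is convex in the shift, which is what makes deterministic optimization reliable in this setting.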
Elizabeth Meckes (2009)
Let $X$ be a $d$-dimensional random vector and $X_\theta$ its projection onto the span of a set of orthonormal vectors $\{\theta_1,\dots,\theta_k\}$. Conditions on the distribution of $X$ are given such that if $\theta$ is chosen according to Haar measure on the Stiefel manifold, the bounded-Lipschitz distance from $X_\theta$ to a Gaussian distribution is concentrated at its expectation; furthermore, an explicit bound is given for the expected distance, in terms of $d$, $k$, and the distribution of $X$, allowing consideration not just of fixed $k$ but of $k$ growing with $d$. The results are applied in the setting of projection pursuit, showing that most $k$-dimensional projections of $n$ data points in $\mathbb{R}^d$ are close to Gaussian, when $n$ and $d$ are large and $k = c\sqrt{\log d}$ for a small constant $c$.
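A small simulation sketch of this setting (dimensions, sample size, and data distribution are illustrative choices, not taken from the abstract): a Haar-distributed orthonormal $k$-frame can be drawn via the QR factorization of a Gaussian matrix, after which the projected point cloud can be inspected for approximate Gaussianity.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 5_000, 400, 3

# Example data: n points in R^d with independent, bounded (hence non-Gaussian),
# mean-zero, unit-variance coordinates.
data = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n, d))

# Haar-distributed orthonormal k-frame: QR of a d x k Gaussian matrix,
# with the sign fix that makes the diagonal of R positive.
G = rng.standard_normal((d, k))
Q, R = np.linalg.qr(G)
Q = Q * np.sign(np.diag(R))

proj = data @ Q   # the k-dimensional projections X_theta of the data points

# Crude check of approximate Gaussianity: the first two empirical moments
# should be close to those of a standard normal in R^k.
print(proj.mean(axis=0))
print(np.cov(proj, rowvar=False))
```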
Let $K \subset \mathbb{R}^d$ be a smooth convex set and let $P_\lambda$ be a Poisson point process on $\mathbb{R}^d$ of intensity $\lambda$. The convex hull of $P_\lambda \cap K$ is a random convex polytope $K_\lambda$. As $\lambda \to \infty$, we show that the variance of the number of $k$-dimensional faces of $K_\lambda$, when properly scaled, converges to a scalar multiple of the affine surface area of $K$. Similar asymptotics hold for the variance of the number of $k$-dimensional faces for the convex hull of a binomial process in $K$.
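As an illustration of the quantity whose variance is studied here, the sketch below simulates the face counts of the convex hull of a Poisson process restricted to the unit disk in $\mathbb{R}^2$; the intensity, the number of repetitions, and the choice of $K$ are placeholders, not values from the abstract.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(2)
lam, reps = 2_000.0, 200
counts = []
for _ in range(reps):
    N = rng.poisson(lam * np.pi)                   # Poisson(lambda * area of the unit disk)
    r = np.sqrt(rng.uniform(size=N))               # radii giving uniform points in the disk
    phi = rng.uniform(0.0, 2.0 * np.pi, size=N)
    pts = np.column_stack((r * np.cos(phi), r * np.sin(phi)))
    counts.append(len(ConvexHull(pts).vertices))   # hull vertices (= edges in 2D)

print("mean:", np.mean(counts), " variance:", np.var(counts))
```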