
Conformal Geometry of Sequential Test in Multidimensional Curved Exponential Family

Published by Akimichi Takemura
Publication date: 2014
Research field: Mathematical Statistics
Paper language: English





This article presents a differential geometrical method for analyzing sequential test procedures. It is based on the primary result on the conformal geometry of statistical manifolds developed in Kumon, Takemura and Takeuchi (2011). By introducing curvature-type random variables, the condition is first clarified for a statistical manifold to be an exponential family under an appropriate sequential test procedure. This result is further elaborated for investigating the efficient sequential test in a multidimensional curved exponential family. The theoretical results are numerically examined using the von Mises-Fisher and hyperboloid models.




Read also

We present a geometrical method for analyzing sequential estimating procedures. It is based on the design principle of the second-order efficient sequential estimation provided in Okamoto, Amari and Takeuchi (1991). By introducing a dual conformal curvature quantity, we clarify the conditions for the covariance minimization of sequential estimators. These conditions are further elaborated for the multidimensional curved exponential family. The theoretical results are then numerically examined using typical statistical models, the von Mises-Fisher and hyperboloid models.
This paper considers a subfamily of the exponential family. Maximum likelihood estimators (MLEs) for the parameter of this family, its probability density function, and its cumulative distribution function are obtained both from a random sample and from lower record values. Mean square error (MSE) is used as the criterion for determining which estimator is better in different situations. Additionally, several theorems are proved about the relations between the MLEs based on lower record values and those based on a random sample, and some interesting asymptotic properties of these estimators are established.
Multidimensional Scaling (MDS) is a classical technique for embedding data in low dimensions, still in widespread use today. Originally introduced in the 1950s, MDS was not designed with high-dimensional data in mind; while it remains popular with data analysis practitioners, it should no doubt be adapted to the high-dimensional data regime. In this paper we study MDS under a modern setting, specifically, high dimensions and ambient measurement noise. We show that, as the ambient noise level increases, MDS suffers a sharp breakdown that depends on the data dimension and noise level, and we derive an explicit formula for this breakdown point in the case of white noise. We then introduce MDS+, an extremely simple variant of MDS, which applies a carefully derived shrinkage nonlinearity to the eigenvalues of the MDS similarity matrix. Under a loss function measuring the embedding quality, MDS+ is the unique asymptotically optimal shrinkage function. We prove that MDS+ offers improved embedding, sometimes significantly so, compared with classical MDS. Furthermore, MDS+ does not require external estimates of the embedding dimension (a famous difficulty in classical MDS), as it calculates the optimal dimension into which the data should be embedded.
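The classical MDS procedure that MDS+ modifies can be sketched as follows: double-center the squared-distance matrix, take the top eigenpairs of the resulting Gram matrix, and scale the eigenvectors. This is a minimal sketch of the standard (Torgerson) algorithm only; the shrinkage nonlinearity that defines MDS+ is not reproduced here.

```python
import numpy as np

def classical_mds(D, m):
    """Classical MDS: embed n points into R^m given squared distances D (n x n)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    B = -0.5 * J @ D @ J                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:m]      # keep the top-m eigenpairs
    scale = np.sqrt(np.clip(vals[idx], 0.0, None))
    return vecs[:, idx] * scale           # n x m embedding

# usage: four corners of the unit square
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
Y = classical_mds(D, 2)
D_rec = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
# for noiseless Euclidean data the pairwise distances are recovered
# exactly, up to rotation/reflection of the configuration
assert np.allclose(D, D_rec)
```

MDS+ would replace the raw eigenvalues `vals[idx]` with a shrunken version before scaling, which is where the paper's optimal nonlinearity enters.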
We study an online multiple testing problem where the hypotheses arrive sequentially in a stream. The test statistics are independent and assumed to have the same distribution under their respective null hypotheses. We investigate two procedures, LORD and LOND, proposed by (Javanmard and Montanari, 2015), which are proved to control the FDR in an online manner. In a certain (static) model, we show that LORD is optimal in an asymptotic sense, in particular as powerful as the (static) Benjamini-Hochberg procedure to first asymptotic order. We also quantify the performance of LOND. Some numerical experiments complement our theory.
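Of the two procedures above, LOND has the simpler form: hypothesis k is rejected when its p-value falls below beta_k times (one plus the number of discoveries so far), where the nonnegative beta_k sum to the target FDR level alpha. The sketch below uses the spending sequence beta_k = alpha * 6 / (pi^2 k^2), which is one common summable choice, not one mandated by the paper.

```python
import math

def lond(pvalues, alpha=0.05):
    """Online testing with the LOND rule of Javanmard & Montanari (2015).

    Reject hypothesis k if p_k <= beta_k * (D_{k-1} + 1), where D_{k-1}
    is the number of discoveries among the first k-1 hypotheses and the
    beta_k form a nonnegative sequence summing to alpha.
    """
    discoveries = 0
    rejections = []
    for k, p in enumerate(pvalues, start=1):
        beta_k = alpha * 6.0 / (math.pi ** 2 * k ** 2)  # summable spending
        reject = p <= beta_k * (discoveries + 1)
        rejections.append(reject)
        if reject:
            discoveries += 1
    return rejections

# usage: a short stream with two small p-values
print(lond([0.001, 0.8, 0.004, 0.5]))  # → [True, False, True, False]
```

Note how the third p-value (0.004) is rejected only because the earlier discovery inflated its threshold; this discovery-dependent "alpha-investing" flavor is what distinguishes online procedures from a static correction.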
Lara Kassab, 2019
Multidimensional scaling (MDS) is a popular technique for mapping a finite metric space into a low-dimensional Euclidean space in a way that best preserves pairwise distances. We study a notion of MDS on infinite metric measure spaces, along with its optimality properties and goodness of fit. This allows us to study the MDS embeddings of the geodesic circle $S^1$ into $\mathbb{R}^m$ for all $m$, and to ask questions about the MDS embeddings of the geodesic $n$-spheres $S^n$ into $\mathbb{R}^m$. Furthermore, we address questions on convergence of MDS. For instance, if a sequence of metric measure spaces converges to a fixed metric measure space $X$, then in what sense do the MDS embeddings of these spaces converge to the MDS embedding of $X$? Convergence is understood when each metric space in the sequence has the same finite number of points, or when each metric space has a finite number of points tending to infinity. We are also interested in notions of convergence when each metric space in the sequence has an arbitrary (possibly infinite) number of points.