
Aspects of optimality of plans orthogonal through other factors

 Added by Sunanda Bagchi
 Publication date 2020
Research language: English





The concept of orthogonality through the block factor (OTB), defined in Bagchi (2010), is extended here to orthogonality through a set (say $S$) of other factors. We discuss the impact of such orthogonality on the precision of the estimates as well as on the inference procedure. Concentrating on the case when $S$ is of size two, we construct a series of plans in each of which every pair of other factors is orthogonal through a given pair of factors. Next we concentrate on plans orthogonal through the block factor (POTB). We construct POTBs for symmetrical experiments with two- and three-level factors. The plans for two-level factors are E-optimal, while those for three-level factors are universally optimal. Finally, we construct POTBs for $s^t(s+1)$ experiments, where $s \equiv 3 \pmod 4$ is a prime power. The plan is universally optimal.
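As a rough illustration of the underlying idea (a simplified proportional-frequency check, not the formal incidence-matrix definition of OTB in Bagchi (2010)), the sketch below tests whether two factors of a plan behave orthogonally within every level combination of a given set $S$ of other factors. The toy plan, the factor names and the helper `orthogonal_through` are hypothetical.

```python
import numpy as np

def orthogonal_through(plan, a, b, through):
    """Check a proportional-frequency condition between factors `a` and `b`
    within every level combination of the factors in `through`.
    `plan` is a dict mapping factor names to equal-length integer arrays."""
    cols = np.column_stack([plan[f] for f in through])
    for levels in {tuple(row) for row in cols}:
        mask = np.all(cols == levels, axis=1)
        xa, xb = plan[a][mask], plan[b][mask]
        n = mask.sum()
        la, lb = np.unique(xa), np.unique(xb)
        # contingency table of a x b inside this slice of the plan
        table = np.array([[np.logical_and(xa == i, xb == j).sum() for j in lb]
                          for i in la])
        expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
        if not np.allclose(table, expected):
            return False
    return True

# Toy 8-run plan with two 2-level treatment factors and a 2-level factor S.
plan = {
    "S": np.array([0, 0, 0, 0, 1, 1, 1, 1]),
    "A": np.array([0, 0, 1, 1, 0, 0, 1, 1]),
    "B": np.array([0, 1, 0, 1, 0, 1, 0, 1]),
}
print(orthogonal_through(plan, "A", "B", through=["S"]))  # True for this toy plan
```

Under this simplification, a plan orthogonal through $S$ would satisfy the check for every pair of factors outside $S$.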



Related research

Permutation tests are widely used in statistics, providing a finite-sample guarantee on the type I error rate whenever the distribution of the samples under the null hypothesis is invariant to some rearrangement. Despite their increasing popularity and empirical success, the theoretical properties of permutation tests, especially their power, have not been fully explored beyond simple cases. In this paper, we attempt to fill this gap by presenting a general non-asymptotic framework for analyzing the power of the permutation test. The utility of our proposed framework is illustrated in the context of two-sample and independence testing under both discrete and continuous settings. In each setting, we introduce permutation tests based on U-statistics and study their minimax performance. We also develop exponential concentration bounds for permuted U-statistics based on a novel coupling idea, which may be of independent interest. Building on these exponential bounds, we introduce permutation tests that are adaptive to unknown smoothness parameters without losing much power. The proposed framework is further illustrated using more sophisticated test statistics, including weighted U-statistics for multinomial testing and Gaussian kernel-based statistics for density testing. Finally, we provide simulation results that further justify the permutation approach.
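For readers who want the mechanics in front of them, a minimal two-sample permutation test looks roughly as follows; it uses a simple difference-of-means statistic rather than the U-statistics analysed in the paper, and the data, permutation count and seeds are arbitrary.

```python
import numpy as np

def permutation_test(x, y, num_perm=2000, seed=None):
    """Two-sample permutation test with the difference of means as statistic.
    Returns a Monte Carlo p-value based on random permutations."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = x.mean() - y.mean()
    count = 0
    for _ in range(num_perm):
        perm = rng.permutation(pooled)
        stat = perm[:len(x)].mean() - perm[len(x):].mean()
        count += abs(stat) >= abs(observed)
    # add-one correction preserves the finite-sample type I error guarantee
    return (count + 1) / (num_perm + 1)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=50)
y = rng.normal(0.5, 1.0, size=50)
print(permutation_test(x, y, seed=1))
```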
In this paper we study optimality aspects of a certain type of designs in a multi-way heterogeneity setting. These are duals of plans orthogonal through the block factor (POTB). Here, by the dual of a main effect plan (say $\rho$) we mean a design in a multi-way heterogeneity setting obtained from $\rho$ by interchanging the roles of the block factors and the treatment factors. Specifically, we take up two series of universally optimal POTBs for symmetrical experiments constructed in Morgan and Uddin (1996). We show that the duals of these plans, viewed as multi-way designs, satisfy M-optimality. Next, we construct another series of multi-way designs and prove their M-optimality, thereby generalising the result of Bagchi and Shah (1989). It may be noted that M-optimality includes all commonly used optimality criteria such as A-, D- and E-optimality.
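As a quick reminder of the criteria named at the end of this abstract, the sketch below computes the A-, D- and E-values of an information matrix; a toy C-matrix stands in for an actual design's information matrix, and the eigenvalue-vector comparison that M-optimality itself requires is not reproduced here.

```python
import numpy as np

def optimality_criteria(info, tol=1e-10):
    """A-, D- and E-criteria of an information (C-)matrix, computed on its
    nonzero spectrum. Under the usual conventions, smaller A and larger
    D and E values indicate a better design."""
    eig = np.linalg.eigvalsh(info)
    eig = eig[eig > tol]            # drop zero eigenvalues of a singular C-matrix
    return {
        "A": np.sum(1.0 / eig),     # sum of reciprocals of nonzero eigenvalues
        "D": np.prod(eig),          # product of nonzero eigenvalues
        "E": eig.min(),             # smallest nonzero eigenvalue
    }

# Toy information matrix (not one of the paper's constructions).
C = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
print(optimality_criteria(C))
```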
Imposing orthogonal transformations between layers of a neural network has been considered for several years now. This facilitates learning by limiting gradient explosion/vanishing, decorrelates the features and improves robustness. In this framework, this paper studies theoretical properties of orthogonal convolutional layers. More precisely, we establish necessary and sufficient conditions on the layer architecture guaranteeing the existence of an orthogonal convolutional transform. These conditions show that orthogonal convolutional transforms exist for almost all architectures used in practice. Recently, a regularization term imposing the orthogonality of convolutional layers has been proposed. We make the link between this regularization term and orthogonality measures. In doing so, we show that this regularization strategy is stable with respect to numerical and optimization errors and remains accurate when the size of the signals/images is large. This holds for both row and column orthogonality. Finally, we confirm these theoretical results with experiments, and also empirically study the landscape of the regularization term.
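A common, simpler surrogate for the regularization discussed above penalizes the departure of the flattened convolution kernel from row orthogonality. The sketch below implements this kernel-level penalty, which is weaker than the operator-level (layer) orthogonality analysed in the paper; the layer shapes are arbitrary.

```python
import torch

def kernel_orthogonality_penalty(weight):
    """Soft orthogonality penalty ||W W^T - I||_F^2 on the flattened kernel.
    This kernel-level surrogate does not by itself guarantee that the
    convolution operator is orthogonal."""
    out_channels = weight.shape[0]
    w = weight.reshape(out_channels, -1)          # (out, in * kh * kw)
    gram = w @ w.t()
    eye = torch.eye(out_channels, device=weight.device, dtype=weight.dtype)
    return ((gram - eye) ** 2).sum()

# Hypothetical 3x3 convolution with 16 input and 32 output channels.
conv = torch.nn.Conv2d(16, 32, kernel_size=3, bias=False)
penalty = kernel_orthogonality_penalty(conv.weight)
print(penalty.item())   # add a scaled version of this to the training loss
```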
Wenjia Wang, Bing-Yi Jing (2021)
In this work, we investigate Gaussian process regression used to recover a function from noisy observations. We derive upper and lower error bounds for Gaussian process regression with possibly misspecified correlation functions. The optimal convergence rate can be attained even if the smoothness of the imposed correlation function exceeds that of the true correlation function and the sampling scheme is quasi-uniform. As byproducts, we also obtain convergence rates for kernel ridge regression with a misspecified kernel function, where the underlying truth is a deterministic function. The convergence rates of Gaussian process regression and kernel ridge regression are closely connected, which is aligned with the relationship between the sample paths of a Gaussian process and the corresponding reproducing kernel Hilbert space.
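Kernel ridge regression, whose misspecified-kernel rates are discussed above, has a closed-form estimator. The sketch below fits it with a Gaussian (RBF) kernel on synthetic one-dimensional data; the lengthscale, regularization level and sample size are arbitrary, and the Gaussian kernel simply stands in for whichever (possibly misspecified) correlation function one imposes.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.2):
    """Gaussian (RBF) kernel matrix between 1-D sample vectors x and y."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * lengthscale ** 2))

def kernel_ridge_fit(x, y_obs, lam=1e-2, lengthscale=0.2):
    """Return the predictor f_hat(x_new) = k(x_new, x) (K + n*lam*I)^{-1} y."""
    n = len(x)
    K = rbf_kernel(x, x, lengthscale)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_obs)
    return lambda x_new: rbf_kernel(x_new, x, lengthscale) @ alpha

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, size=80))            # quasi-uniform design points
y_obs = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=80)
f_hat = kernel_ridge_fit(x, y_obs)
x_test = np.linspace(0, 1, 5)
print(np.round(f_hat(x_test), 3))                  # compare with sin(2*pi*x_test)
```

For a suitable choice of the regularization parameter (noise variance divided by sample size), this estimator coincides with the Gaussian process posterior mean, which is the connection between the two methods that the abstract alludes to.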
Results by van der Vaart (1991) from semi-parametric statistics about the existence of a non-zero Fisher information are reviewed in an infinite-dimensional non-linear Gaussian regression setting. Information-theoretically optimal inference on aspects of the unknown parameter is possible if and only if the adjoint of the linearisation of the regression map satisfies a certain range condition. It is shown that this range condition may fail in a commonly studied elliptic inverse problem with a divergence form equation, and that a large class of smooth linear functionals of the conductivity parameter cannot be estimated efficiently in this case. In particular, Gaussian Bernstein-von Mises-type approximations for Bayesian posterior distributions do not hold in this setting.