
On the Importance of Asymmetry and Monotonicity Constraints in Maximal Correlation Analysis

Posted by Elad Domanovitz
Publication date: 2019
Research field: Information engineering
Paper language: English





The maximal correlation coefficient is a well-established generalization of the Pearson correlation coefficient for measuring non-linear dependence between random variables. It is appealing from a theoretical standpoint, as it satisfies Rényi's axioms for a measure of dependence. It is also attractive from a computational point of view thanks to the celebrated alternating conditional expectation (ACE) algorithm, which allows its empirical version to be computed directly from observed data. Nevertheless, it was recognized from the outset that the maximal correlation coefficient suffers from some fundamental deficiencies that limit its usefulness as an indicator of estimation quality. Another well-known measure of dependence, the correlation ratio, also suffers from drawbacks. Specifically, the maximal correlation coefficient equals one too easily, whereas the correlation ratio equals zero too easily. The present work recounts some past attempts to alter the definition of the maximal correlation coefficient in order to overcome its weaknesses, and then proposes a natural variant of the maximal correlation coefficient together with a modified alternating conditional expectation algorithm to compute it. The proposed dependence measure resolves the major weakness of the correlation ratio as well, and may be viewed as a bridge between these two classical measures.
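As a concrete illustration of the computational side, the following is a minimal NumPy sketch of the classical ACE iteration for the empirical maximal correlation of two discrete samples; the function name, the binning by unique categories, and the stopping rule are illustrative choices, not taken from the paper. The toy call at the end also illustrates the "equals one too easily" criticism: Y is a deterministic but non-invertible function of X, yet the estimate approaches one.

import numpy as np

def empirical_maximal_correlation(x, y, n_iter=500, tol=1e-12, seed=0):
    """ACE-style power iteration for the empirical maximal correlation
    of two discrete (categorical) samples of equal length."""
    x, y = np.asarray(x), np.asarray(y)
    _, xi = np.unique(x, return_inverse=True)   # integer codes for X
    _, yi = np.unique(y, return_inverse=True)   # integer codes for Y

    rng = np.random.default_rng(seed)
    g = rng.standard_normal(len(y))             # random initial score of Y
    g = (g - g.mean()) / g.std()

    rho_prev = 0.0
    for _ in range(n_iter):
        # f(x) <- empirical E[g(Y) | X = x], then standardize
        f = (np.bincount(xi, weights=g) / np.bincount(xi))[xi]
        f = (f - f.mean()) / f.std()
        # g(y) <- empirical E[f(X) | Y = y], then standardize
        g = (np.bincount(yi, weights=f) / np.bincount(yi))[yi]
        g = (g - g.mean()) / g.std()

        rho = float(np.mean(f * g))             # corr(f(X), g(Y)) at this step
        if abs(rho - rho_prev) < tol:
            break
        rho_prev = rho
    return rho

# Toy check of the "equals one too easily" behaviour: Y = X mod 2 is a
# deterministic, non-invertible function of X, yet the estimate is ~ 1.
x = np.random.default_rng(1).integers(0, 4, size=10_000)
y = x % 2
print(empirical_maximal_correlation(x, y))      # approximately 1.0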




Read also

Based on the notion of maximal correlation, Kimeldorf, May and Sampson (1980) introduce a measure of correlation between two random variables, called the concordant monotone correlation (CMC). We revisit, generalize and prove new properties of this measure of correlation. It is shown that CMC captures various types of correlation detected by measures of rank correlation such as the Kendall tau correlation. We show that the CMC satisfies the data processing and tensorization properties (the properties that make ordinary maximal correlation applicable to problems in information theory). Furthermore, CMC is shown to be intimately related to the FKG inequality. In addition, a combinatorial application of CMC is given for which we do not know of another method that derives its result. Finally, we study the complexity of computing the CMC, which is a non-convex optimization problem with local maxima. We give a simple but exponential-time algorithm that is guaranteed to output the exact value of the generalized CMC.
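For orientation, the two quantities being contrasted here can be sketched as follows (standard formulations, with regularity conditions omitted; the exact normalizations are an assumption rather than a quotation from the paper):

% Hirschfeld-Gebelein-Renyi maximal correlation: supremum over all
% measurable f, g with finite, non-zero variance
\rho^{*}(X,Y) \;=\; \sup_{f,\,g}\ \operatorname{corr}\bigl(f(X),\,g(Y)\bigr)

% Concordant monotone correlation (CMC): the same supremum,
% restricted to non-decreasing f and g
\mathrm{CMC}(X,Y) \;=\; \sup_{f,\,g\ \text{non-decreasing}}\ \operatorname{corr}\bigl(f(X),\,g(Y)\bigr)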
We consider the question of estimating a real low-complexity signal (such as a sparse vector or a low-rank matrix) from the phase of complex random measurements. We show that in this phase-only compressive sensing (PO-CS) scenario, we can perfectly recover such a signal with high probability and up to a global unknown amplitude, provided the sensing matrix is a complex Gaussian random matrix and the number of measurements is large compared to the complexity level of the signal space. Our approach proceeds by recasting the (non-linear) PO-CS scheme as a linear compressive sensing model built from a signal normalization constraint and a phase-consistency constraint that forces any signal estimate to match the observed phases in the measurement domain. Practically, stable and robust signal direction estimation is achieved by any instance-optimal algorithm of the compressive sensing literature (such as basis pursuit denoising). This is ensured by proving that, under the above condition on the number of measurements, the matrix associated with this equivalent linear model satisfies the restricted isometry property with high probability. We finally observe experimentally that robust signal direction recovery is reached at about twice the number of measurements needed for signal recovery in compressive sensing. A minimal sketch of this linear recast is given below.
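The following sketch illustrates the recast described above under simplifying assumptions: a real sparse signal, an i.i.d. complex Gaussian sensing matrix, plain basis pursuit instead of basis pursuit denoising, and the generic convex solver cvxpy. Dimensions and names are illustrative, not taken from the paper.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, k, m = 200, 5, 120                       # ambient dim, sparsity, measurements

# Sparse real signal of unit norm
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x /= np.linalg.norm(x)

# Complex Gaussian sensing matrix; only the measurement phases are kept
A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2)
z = A @ x
phases = z / np.abs(z)

# Linear recast: conj(phase_i) * a_i applied to the true x is real and positive,
# so its imaginary parts give phase-consistency constraints and the averaged
# real part gives a single normalization constraint fixing the scale.
B = np.conj(phases)[:, None] * A
H_imag = np.imag(B)                         # m phase-consistency rows
h_norm = np.real(B).mean(axis=0)            # 1 normalization row

# Basis pursuit on the equivalent linear model
u = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(u)),
                  [H_imag @ u == 0, h_norm @ u == 1])
prob.solve()

x_hat = u.value / np.linalg.norm(u.value)
print("direction error:", np.linalg.norm(x_hat - x))   # small when m is large enough

Only the direction is compared at the end, since the amplitude of x is unrecoverable from phase-only measurements.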
We consider upper bounds on the error probability in channel coding. We derive an improved maximum-likelihood union bound, which takes into account events where the likelihood of the correct codeword is tied with that of some competitors. We compare this bound with various previous results, both qualitatively and quantitatively. With respect to the maximal error probability of linear codes, we observe that when the channel is additive, the derivation of bounds, as well as the assumptions on the admissible encoder and decoder, simplify considerably.
Recently, the authors studied the connection between each maximal monotone operator T and a family H(T) of convex functions. Each member of this family characterizes the operator and satisfies two particular inequalities. The aim of this paper is to establish the converse of the latter fact, namely, that every convex function satisfying those two particular inequalities is associated with a unique maximal monotone operator.
In communication networks, cooperative strategies are coding schemes where network nodes work together to improve network performance metrics such as the total rate delivered across the network. This work studies encoder cooperation in the setting of a discrete multiple access channel (MAC) with two encoders and a single decoder. A network node, here called the cooperation facilitator (CF), that is connected to both encoders via rate-limited links, enables the cooperation strategy. Previous work by the authors presents two classes of MACs: (i) one class where the average-error sum-capacity has an infinite derivative in the limit where CF output link capacities approach zero, and (ii) a second class of MACs where the maximal-error sum-capacity is not continuous at the point where the output link capacities of the CF equal zero. This work contrasts the power of the CF in the maximal- and average-error cases, showing that a constant number of bits communicated over the CF output link can yield a positive gain in the maximal-error sum-capacity, while a far greater number of bits, even numbers that grow sublinearly in the blocklength, can never yield a non-negligible gain in the average-error sum-capacity.