We present a novel framework that exploits the cascade of phase transitions occurring during simulated annealing of the Expectation-Maximisation algorithm to cluster datasets with multi-scale structure. Using the weighted local covariance, we can extract, a posteriori and without any prior knowledge, information on the number of clusters at different scales together with their sizes. We also study the linear stability of the iterative scheme to derive the threshold at which the first transition occurs and show how to approximate the subsequent ones. Finally, we combine simulated annealing with recent developments in regularised Gaussian mixture models to learn a principal graph from spatially structured datasets that may also exhibit many scales.
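To make the mechanism concrete, here is a minimal sketch, under stated assumptions, of deterministic-annealing EM for an isotropic Gaussian mixture: the inverse temperature is raised gradually, the component means split at a cascade of critical values, and the weighted local covariance of each component, computed a posteriori from the responsibilities, gives a size estimate for the clusters recovered at each scale. The function names, the annealing schedule, and the isotropic simplification are illustrative assumptions, not the paper's implementation.

```python
# Sketch of deterministic-annealing EM for an isotropic Gaussian mixture.
# Assumptions (not from the paper): isotropic components, a geometric
# annealing schedule for beta = 1/T, and fixed unit "bandwidth" in the E-step.
import numpy as np

def annealed_em(X, k, betas, n_steps=50, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # start all means near the global centroid, slightly perturbed
    mu = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, d))
    pi = np.full(k, 1.0 / k)
    history = []
    for beta in betas:                        # annealing schedule
        for _ in range(n_steps):
            # E-step at inverse temperature beta: tempered responsibilities
            sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
            log_r = np.log(pi)[None, :] - 0.5 * beta * sq
            log_r -= log_r.max(axis=1, keepdims=True)
            r = np.exp(log_r)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: update mixing weights and means
            Nk = r.sum(axis=0) + 1e-12
            pi = Nk / n
            mu = (r.T @ X) / Nk[:, None]
        # a posteriori: trace of the weighted local covariance of each
        # component, used here as a size estimate at the current scale
        cov_trace = np.array([
            np.einsum('i,ij,ij->', r[:, j], X - mu[j], X - mu[j]) / Nk[j]
            for j in range(k)
        ])
        history.append((beta, mu.copy(), cov_trace))
    return history

# usage: two well-separated groups, each made of two nearby sub-clusters
centres = [(0, 0), (0, 0.3), (3, 3), (3, 3.3)]
X = np.vstack([np.asarray(c) + 0.05 * np.random.randn(200, 2) for c in centres])
for beta, mu, tr in annealed_em(X, k=4, betas=np.geomspace(0.1, 200, 12)):
    n_distinct = len(np.unique(mu.round(1), axis=0))
    print(f"beta={beta:7.2f}  distinct means={n_distinct}")
```

At small beta all means collapse onto the global centroid; as beta grows they first separate into the two coarse groups and only later resolve the fine sub-clusters, which is the cascade of transitions the abstract refers to.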
We study the problem of detecting a structured, low-rank signal matrix corrupted with additive Gaussian noise. This includes clustering in a Gaussian mixture model, sparse PCA, and submatrix localization. Each of these problems is conjectured to exhi
We consider the problem of Gaussian mixture clustering in the high-dimensional limit where the data consists of $m$ points in $n$ dimensions, $n, m \rightarrow \infty$ and $\alpha = m/n$ stays finite. Using exact but non-rigorous methods from statistical
The complexity of patterns is key information that allows the human brain to distinguish objects of about the same size and shape. Like other innate human senses, complexity perception cannot be easily quantified. We propose a transparent and universal machine meth
We study the shape, elasticity and fluctuations of the recently predicted (cond-mat/9510172) and subsequently observed in numerical simulations (cond-mat/9705059) tubule phase of anisotropic membranes, as well as the phase transitions into and out
Neural networks have been shown to perform remarkably well in classification tasks over structured high-dimensional datasets. However, the learning dynamics of such networks are still poorly understood. In this paper we study in detail the training dy