
Emergent limits of an indirect measurement from phase transitions of inference

Added by Satoru Tokuda
Publication date: 2020
Field: Physics
Language: English





Measurements are inseparable from inference, where the estimation of signals of interest from other observations is called an indirect measurement. While a variety of measurement limits have been defined by the physical constraints on each setup, the fundamental limit of an indirect measurement is essentially the limit of inference. Here, we propose the concept of statistical limits on indirect measurement: the bounds of distinction between signals and noise and between one signal and another. By developing the asymptotic theory of Bayesian regression, we investigate the phenomenology of a typical indirect measurement and demonstrate the existence of these limits. Based on the connection between inference and statistical physics, we also provide a unified interpretation in which these limits emerge from phase transitions of inference. Our results could pave the way for novel experimental design, enabling assessment of the required quality of observations according to the assumed ground truth before the indirect measurement in question is actually performed.
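
As a rough illustration of the kind of statistical limit described above, the following sketch compares the Bayesian evidence of a noise-only model against a signal-plus-noise model on toy data; the signal shape, prior width, and noise level are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch (not the paper's implementation): Bayesian detection of a
# signal against noise via the log marginal likelihood (negative free energy).
# Model 0: pure noise, y ~ N(0, sigma^2 I).
# Model 1: y = a * f(x) + noise, with a Gaussian prior a ~ N(0, tau^2),
#          so marginally y ~ N(0, sigma^2 I + tau^2 f f^T).
# The "limit" shows up as the amplitude below which the Bayes factor no
# longer favours Model 1. All names and numbers here are illustrative.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n, sigma, tau = 50, 1.0, 1.0
x = np.linspace(0.0, 1.0, n)
f = np.sin(2 * np.pi * x)                      # assumed known signal shape

def log_evidence(y, cov):
    return multivariate_normal(mean=np.zeros(n), cov=cov).logpdf(y)

cov0 = sigma**2 * np.eye(n)                    # noise-only model
cov1 = cov0 + tau**2 * np.outer(f, f)          # signal + noise model

for a_true in [0.0, 0.1, 0.3, 1.0]:            # scan the true amplitude
    y = a_true * f + sigma * rng.standard_normal(n)
    log_bayes = log_evidence(y, cov1) - log_evidence(y, cov0)
    print(f"a = {a_true:4.2f}  log Bayes factor = {log_bayes:+.2f}")
```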

Related research


Machine learning-inspired techniques have emerged as a new paradigm for the analysis of phase transitions in quantum matter. In this work, we introduce a supervised learning algorithm for studying critical phenomena from measurement data, which is based on iteratively training convolutional networks of increasing complexity, and test it on the transverse-field Ising chain and the $q=6$ Potts model. At the continuous Ising transition, we identify scaling behavior in the classification accuracy, from which we infer a characteristic classification length scale. It displays a power-law divergence at the critical point, with a scaling exponent that matches the diverging correlation length. Our algorithm correctly identifies the thermodynamic phase of the system and extracts scaling behavior from projective measurements, independently of the basis in which the measurements are performed. Furthermore, we show that the classification length scale is absent for the $q=6$ Potts model, which has a first-order transition and thus lacks a divergent correlation length. The main intuition underlying our finding is that, for measurement patches smaller than the correlation length, the system appears to be at the critical point, and therefore the algorithm cannot identify the phase from which the data were drawn.
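
The scaling analysis described above can be mimicked on toy data; the sketch below assumes a simple functional form for the classification accuracy and only shows how a classification length scale and its divergence exponent could be extracted. None of the numbers come from the paper.

```python
# Minimal sketch (toy data, assumed functional forms): extracting a
# "classification length scale" from accuracy-vs-patch-size curves and
# checking its power-law divergence near the critical point.
import numpy as np

nu_true, t_c = 1.0, 1.0                         # Ising-chain exponent assumed
temps = t_c + np.linspace(0.02, 0.2, 8)         # temperatures above T_c
patch_sizes = np.arange(2, 200)

def toy_accuracy(L, xi):
    # Assumed form: accuracy grows from 0.5 (random) towards 1 once L >> xi.
    return 1.0 - 0.5 * np.exp(-L / xi)

# Classification length scale: smallest patch size reaching 90% accuracy.
xi_class = []
for t in temps:
    xi = 1.0 / abs(t - t_c) ** nu_true           # assumed correlation length
    acc = toy_accuracy(patch_sizes, xi)
    xi_class.append(patch_sizes[np.argmax(acc >= 0.9)])

# A log-log fit of xi_class against |T - T_c| recovers the exponent nu.
slope, _ = np.polyfit(np.log(np.abs(temps - t_c)), np.log(xi_class), 1)
print(f"fitted exponent nu ~ {-slope:.2f} (expected {nu_true})")
```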
We numerically investigate the structure of many-body wave functions of 1D random quantum circuits with local measurements, employing the participation entropies. The leading term in the system-size dependence of the participation entropies indicates a multifractal scaling of the wave functions at any non-zero measurement rate. The sub-leading term contains universal information about measurement-induced phase transitions and plays the role of an order parameter, being non-zero in the error-correcting phase and vanishing in the quantum Zeno phase. We provide an analytical interpretation of this behavior by expressing the participation entropy in terms of partition functions of classical statistical models in 2D.
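
For concreteness, the following sketch computes participation entropies S_q of a random state vector; the random state is only a stand-in for the circuit wave functions studied above, and nothing here reproduces the paper's results.

```python
# Minimal sketch (illustrative only): participation entropies S_q of a
# normalised state vector, the diagnostic used in the abstract.
import numpy as np

def participation_entropy(psi, q):
    """S_q = (1/(1-q)) * log(sum_i p_i^q) with p_i = |psi_i|^2; S_1 is Shannon."""
    p = np.abs(psi) ** 2
    p = p[p > 0]
    if q == 1:
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

rng = np.random.default_rng(1)
for n_qubits in (8, 10, 12):
    dim = 2 ** n_qubits
    psi = rng.standard_normal(dim) + 1j * rng.standard_normal(dim)
    psi /= np.linalg.norm(psi)
    s2 = participation_entropy(psi, q=2)
    # For Haar-like random states S_2 grows as ln(dim) up to a constant.
    print(f"N = {n_qubits:2d}  S_2 = {s2:.2f}  ln(dim) = {np.log(dim):.2f}")
```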
Electronic transport is at the heart of many phenomena in condensed matter physics and materials science. Magnetic imaging is a non-invasive tool for detecting electric current in materials and devices. A two-dimensional current density can be reconstructed from an image of a single component of the magnetic field produced by the current. In this work, we approach the reconstruction problem in the framework of Bayesian inference, i.e. we solve for the most likely current density given an image obtained by a magnetic probe. To enforce a sensible current density, priors are used to associate a cost with unphysical features such as pixel-to-pixel oscillations or current outside the device boundary. Beyond previous work, our approach does not require analytically tractable priors and therefore creates the flexibility to use priors that have not been explored in the context of current reconstruction. Here, we implement several such priors that have desirable properties. A challenging aspect of imposing a prior is choosing its optimal strength. We describe an empirical way to determine the appropriate strength of the prior. We test our approach on numerically generated examples. Our code is released in an open-source Python package called pysquid.
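
A minimal sketch of the kind of MAP reconstruction described above is given below, using a 1D toy kernel and a Gaussian smoothness prior; it is not the pysquid implementation, and the paper's priors need not be Gaussian or analytically tractable.

```python
# Minimal sketch (toy 1D problem, assumed kernel and prior): MAP estimate of
# a current density from a noisy field measurement b = K @ j + noise.
import numpy as np

rng = np.random.default_rng(2)
n = 100
x = np.linspace(-1.0, 1.0, n)
j_true = np.exp(-((x / 0.3) ** 2))               # assumed ground-truth current

# Toy forward kernel: smooth, translation-invariant blurring of the current.
K = np.exp(-np.subtract.outer(x, x) ** 2 / (2 * 0.1 ** 2))
K /= K.sum(axis=1, keepdims=True)

sigma = 0.01
b = K @ j_true + sigma * rng.standard_normal(n)   # measured field image

# Smoothness prior: penalise pixel-to-pixel oscillations via finite differences.
D = np.diff(np.eye(n), axis=0)
lam = 1e-3                                        # prior strength (to be tuned)

# With Gaussian likelihood and Gaussian prior, the MAP estimate is closed form.
j_map = np.linalg.solve(K.T @ K + lam * D.T @ D, K.T @ b)
print("relative reconstruction error:",
      np.linalg.norm(j_map - j_true) / np.linalg.norm(j_true))
```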
Jérôme Tubiana 2016
Automatically extracting the complex set of features composing real high-dimensional data is crucial for achieving high performance in machine-learning tasks. Restricted Boltzmann Machines (RBMs) are empirically known to be efficient for this purpose and to be able to generate distributed and graded representations of the data. We characterize the structural conditions (sparsity of the weights, low effective temperature, nonlinearities in the activation functions of hidden units, and adaptation of fields maintaining the activity in the visible layer) that allow RBMs to operate in such a compositional phase. Evidence is provided by the replica analysis of an adequate statistical ensemble of random RBMs and by RBMs trained on the handwritten-digits dataset MNIST.
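
For readers unfamiliar with the model, the sketch below trains a small binary RBM with one-step contrastive divergence on toy data; it only makes the objects in the abstract (visible and hidden units, weights, fields) concrete and does not reproduce the replica analysis or the compositional-phase conditions.

```python
# Minimal sketch (generic CD-1 training on toy data, not the paper's method).
import numpy as np

rng = np.random.default_rng(3)
n_vis, n_hid, lr = 16, 8, 0.05
W = 0.01 * rng.standard_normal((n_vis, n_hid))    # visible-hidden couplings
a = np.zeros(n_vis)                                # visible fields
b = np.zeros(n_hid)                                # hidden fields

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: noisy copies of two binary prototypes (the "features" to compose).
protos = rng.integers(0, 2, size=(2, n_vis)).astype(float)
data = protos[rng.integers(0, 2, size=500)]
data = np.where(rng.random(data.shape) < 0.05, 1 - data, data)

for epoch in range(200):                           # one-step contrastive divergence
    v0 = data
    ph0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)

print("hidden activations on a prototype:", np.round(sigmoid(protos[0] @ W + b), 2))
```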
While the study of graphs has been very popular, simplicial complexes are relatively new to the network science community. Despite being a rich source of information, graphs are limited to pairwise interactions. However, several real-world networks, such as social and neuronal networks, involve simultaneous interactions between more than two nodes. Simplicial complexes provide a powerful mathematical way to model such interactions. The spectrum of the graph Laplacian is known to be indicative of community structure, with nonzero eigenvectors encoding the identity of communities. Here, we propose that the spectrum of the Hodge Laplacian, a higher-order Laplacian applied to simplicial complexes, encodes simplicial communities. We formulate an algorithm to extract simplicial communities (of arbitrary dimension). We apply this algorithm to simplicial complex benchmarks and to real data, including social networks and language networks, where higher-order relationships are intrinsic. Additionally, since datasets for simplicial complexes are scarce, we introduce a method of optimally generating a simplicial complex from its network backbone by estimating the true higher-order relationships when its community structure is known. We do so by using the adjusted mutual information to identify the configuration that best matches the expected data partition. Lastly, we demonstrate an example of persistent simplicial communities inspired by the field of persistent homology.
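
As a minimal illustration of the central object above, the following sketch builds the Hodge 1-Laplacian of a small hand-constructed simplicial complex from its boundary matrices and inspects its spectrum; the complex, orientations, and interpretation are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch (small hand-built complex): the Hodge 1-Laplacian
# L1 = B1^T B1 + B2 B2^T, whose low-lying spectrum the abstract associates
# with simplicial communities.
import numpy as np

nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]    # sorted pairs
triangles = [(0, 1, 2), (2, 3, 4)]                           # filled 2-simplices

edge_idx = {e: i for i, e in enumerate(edges)}

# Node-to-edge boundary matrix B1 (edges oriented from lower to higher node).
B1 = np.zeros((len(nodes), len(edges)))
for col, (i, j) in enumerate(edges):
    B1[i, col], B1[j, col] = -1.0, 1.0

# Edge-to-triangle boundary matrix B2: d(i,j,k) = (j,k) - (i,k) + (i,j).
B2 = np.zeros((len(edges), len(triangles)))
for col, (i, j, k) in enumerate(triangles):
    B2[edge_idx[(j, k)], col] = 1.0
    B2[edge_idx[(i, k)], col] = -1.0
    B2[edge_idx[(i, j)], col] = 1.0

L1 = B1.T @ B1 + B2 @ B2.T
eigvals, eigvecs = np.linalg.eigh(L1)
print("Hodge 1-Laplacian spectrum:", np.round(eigvals, 3))
# Eigenvectors attached to the smallest eigenvalues localise on groups of
# edges; that localisation is what gets read off as simplicial communities.
```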