We study a phase transition in parameter learning of Hidden Markov Models (HMMs). To this end, we generate sequences of observed symbols from given discrete HMMs with uniformly distributed transition probabilities and a noise level encoded in the output probabilities. Using the Baum-Welch (BW) algorithm, an Expectation-Maximization algorithm from the field of Machine Learning, we then attempt to estimate the parameters of each investigated realization of an HMM. We study HMMs with n = 4, 8 and 16 states. By varying the amount of accessible learning data and the noise level, we observe a phase-transition-like change in the performance of the learning algorithm. For larger HMMs and more learning data, the learning behavior improves dramatically below a certain threshold in the noise strength; above this threshold, learning is not possible. Furthermore, we use an overlap parameter applied to the results of a maximum-a-posteriori (Viterbi) algorithm to investigate the accuracy of the hidden-state estimation around the phase transition.
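As a rough illustration of the pipeline described above, the following sketch (not the authors' code) draws a random n-state HMM with uniformly distributed transition probabilities, encodes a noise level eps in the emission matrix, generates an observation sequence, re-estimates the parameters with Baum-Welch, and compares a Viterbi decoding with the true hidden states. It assumes the hmmlearn package (CategoricalHMM); the particular noise parametrisation (each state emits its "own" symbol with probability 1 - eps) and the chosen parameter values are illustrative assumptions, not taken from the paper.

import numpy as np
from hmmlearn import hmm   # assumed dependency (hmmlearn), not specified in the abstract

def random_hmm(n_states, eps, rng):
    # Transition matrix with uniformly distributed entries, rows normalised.
    A = rng.uniform(size=(n_states, n_states))
    A /= A.sum(axis=1, keepdims=True)
    # Noisy emission matrix: state i emits symbol i with probability 1 - eps,
    # the remaining probability eps is spread over the other symbols (assumption).
    B = np.full((n_states, n_states), eps / (n_states - 1))
    np.fill_diagonal(B, 1.0 - eps)
    pi = np.full(n_states, 1.0 / n_states)
    return pi, A, B

rng = np.random.default_rng(0)
n, eps, T = 4, 0.1, 10_000        # number of states, noise level, sequence length

pi, A, B = random_hmm(n, eps, rng)
true_model = hmm.CategoricalHMM(n_components=n)
true_model.startprob_, true_model.transmat_, true_model.emissionprob_ = pi, A, B
obs, hidden = true_model.sample(T, random_state=1)   # observed symbols and hidden states

# Baum-Welch (EM) re-estimation of the parameters from the observed symbols alone.
learner = hmm.CategoricalHMM(n_components=n, n_iter=200, tol=1e-4, random_state=2)
learner.fit(obs)

# Viterbi decoding of the hidden states with the learned model. A proper overlap
# parameter would maximise the agreement over relabellings of the states; the raw
# agreement reported here is only a crude indicator.
decoded = learner.predict(obs)
print("raw state overlap:", np.mean(decoded == hidden))

In this toy setup, a small eps and a long sequence should give a learned model close to the true one and a high overlap, while a large eps should make the re-estimation fail; the paper studies exactly how this performance changes with the amount of learning data and the noise level.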