The relationship between period-doubling bifurcations and Feigenbaum's constants has been studied for nearly 40 years and has helped uncover many fundamental aspects of universal scaling across nonlinear dynamical systems. This paper will combine information entropy with symbolic dynamics to demonstrate how period doubling can be defined using these tools alone. In addition, the technique allows us to uncover some unexpectedly simple estimates for Feigenbaum's constants, which relate them to log 2 and the golden ratio, φ, as well as to each other.
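As context for the period-doubling cascade described above, Feigenbaum's first constant δ can be illustrated numerically with the logistic map. This is a minimal sketch, not the paper's entropy-based estimate: the bifurcation values r_k below are standard published numerical values for x → r·x·(1−x) (r_1 = 3 and r_2 = 1 + √6 are exact; the rest are rounded approximations).

```python
# First period-doubling bifurcation points of the logistic map
# x -> r*x*(1-x); standard published values, rounded to 6 decimals.
r = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]

# Feigenbaum's delta is the limiting ratio of successive
# bifurcation intervals (r_k - r_{k-1}) / (r_{k+1} - r_k).
ratios = [(r[k] - r[k - 1]) / (r[k + 1] - r[k]) for k in range(1, len(r) - 1)]
print(ratios)  # the ratios drift toward delta ≈ 4.6692...
```

Even with only five bifurcation points, the ratios visibly converge toward δ ≈ 4.6692.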
The general idea of information entropy introduced by C.E. Shannon hangs over everything we do and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. The Shannon information
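The quantity at the center of the abstract above is the Shannon entropy of a distribution, H(p) = −Σ pᵢ log₂ pᵢ. A minimal sketch (the function name is illustrative):

```python
from math import log2

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i = 0 contribute nothing, so they are skipped."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.25] * 4))   # 2.0 bits: uniform over four outcomes
```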
We introduce the matrix-based Rényi's $\alpha$-order entropy functional to parameterize Tishby et al.'s information bottleneck (IB) principle with a neural network. We term our methodology Deep Deterministic Information Bottleneck (DIB), as it avoids var
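The matrix-based Rényi's α-order entropy mentioned above is computed from the eigenvalue spectrum of a normalized kernel Gram matrix rather than from an explicit probability distribution. The following is a hedged sketch of that idea, not the DIB implementation: the RBF kernel, the bandwidth σ, and the unit-trace normalization are illustrative assumptions.

```python
import numpy as np

def renyi_alpha_entropy(X, alpha=2.0, sigma=1.0):
    """Sketch of a matrix-based Rényi alpha-order entropy:
    S_alpha(A) = 1/(1-alpha) * log2(sum_i lambda_i(A)^alpha),
    where A is a Gram matrix normalized to unit trace (assumed setup)."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2 * sigma ** 2))        # RBF Gram matrix
    A = K / np.trace(K)                       # unit-trace normalization
    lam = np.clip(np.linalg.eigvalsh(A), 0, None)  # guard tiny negatives
    return float(np.log2(np.sum(lam ** alpha)) / (1 - alpha))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
H = renyi_alpha_entropy(X)
print(H)  # lies between 0 and log2(50)
```

Because A is positive semidefinite with unit trace, its eigenvalues behave like a probability distribution, which is what makes the spectral formula an entropy.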
The von Neumann graph entropy is a measure of graph complexity based on the Laplacian spectrum. It has recently found applications in various learning tasks driven by networked data. However, it is computationally demanding and hard to interpret using
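The definition referenced above can be stated concretely: rescale the graph Laplacian to unit trace so its eigenvalues sum to one, then take the Shannon entropy of that spectrum. A minimal sketch (the combinatorial Laplacian and the example graph are assumptions for illustration):

```python
import numpy as np

def von_neumann_graph_entropy(adj):
    """Von Neumann graph entropy: -sum_i mu_i * log2(mu_i), where mu_i
    are the eigenvalues of the Laplacian rescaled to unit trace."""
    adj = np.asarray(adj, dtype=float)
    L = np.diag(adj.sum(axis=1)) - adj        # combinatorial Laplacian
    rho = L / np.trace(L)                     # density-matrix scaling
    mu = np.clip(np.linalg.eigvalsh(rho), 0, None)
    mu = mu[mu > 1e-12]                       # drop zero modes (0*log 0 = 0)
    return float(-np.sum(mu * np.log2(mu)))

# Path graph on four vertices: 0-1-2-3
path4 = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
H_path = von_neumann_graph_entropy(path4)
print(H_path)
```

The eigendecomposition is the source of the computational cost the abstract mentions: it scales cubically with the number of vertices.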
The Large Synoptic Survey Telescope (LSST) will produce an unprecedented number of light curves in six optical bands. Robust and efficient methods that can aggregate data from multidimensional, sparsely sampled time series are needed. In this paper
We propose a new estimator to measure directed dependencies in time series. The dimensionality of data is first reduced using a new non-uniform embedding technique, where the variables are ranked according to a weighted sum of the amount of new infor
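The directed dependencies mentioned above are commonly quantified with transfer entropy, TE(X→Y) = Σ p(y′, y, x) log₂[p(y′|y, x)/p(y′|y)]. The sketch below is a generic plug-in estimator for binary series with history length 1, not the paper's non-uniform-embedding estimator; the synthetic data and function name are illustrative.

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(src, dst):
    """Plug-in TE(src -> dst) with history length 1, in bits."""
    triple, pair_yx, pair_yy, hist = Counter(), Counter(), Counter(), Counter()
    n = len(dst) - 1
    for t in range(n):
        yn, yp, xp = dst[t + 1], dst[t], src[t]
        triple[(yn, yp, xp)] += 1   # joint counts of (y', y, x)
        pair_yx[(yp, xp)] += 1
        pair_yy[(yn, yp)] += 1
        hist[yp] += 1
    te = 0.0
    for (yn, yp, xp), c in triple.items():
        p_full = c / pair_yx[(yp, xp)]          # p(y' | y, x)
        p_self = pair_yy[(yn, yp)] / hist[yp]   # p(y' | y)
        te += (c / n) * log2(p_full / p_self)
    return te

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                # y copies x with a one-step lag
te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy)  # close to 1 bit: x drives y
print(te_yx)  # close to 0: no influence back
```

The asymmetry between the two directions is what makes the measure "directed": mutual information alone would not distinguish driver from response.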