We study the problem of identifying the causal relationship between two discrete random variables from observational data. We recently proposed a novel framework called entropic causality that works in a very general functional model but makes the assumption that the unobserved exogenous variable has small entropy in the true causal direction. This framework requires the solution of a minimum entropy coupling problem: given marginal distributions of m discrete random variables, each on n states, find the minimum-entropy joint distribution that respects the given marginals. This corresponds to minimizing a concave function of n^m variables over a convex polytope defined by nm linear constraints, called a transportation polytope. Unfortunately, it was recently shown that this minimum entropy coupling problem is NP-hard, even for 2 variables with n states. Even representing points (joint distributions) over this space can require exponential complexity (in n, m) if done naively. In our recent work we introduced an efficient greedy algorithm to find an approximate solution for this problem. In this paper we analyze this algorithm and establish two results: the algorithm always finds a local minimum, and this local minimum is within an additive approximation error of the unknown global optimum.
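To make the greedy step concrete, here is a minimal Python sketch of a greedy coupling of this kind (a plausible rendering of the procedure, not necessarily the paper's exact algorithm): at every step it locates the largest remaining mass in each marginal, assigns the minimum of those maxima to the corresponding joint cell, and subtracts that amount from each marginal.

```python
import math

def greedy_min_entropy_coupling(marginals, tol=1e-12):
    """Greedy approximate minimum entropy coupling (illustrative sketch).

    marginals: list of m probability vectors, each summing to 1.
    Returns a sparse joint distribution as {index tuple: probability}.
    """
    residual = [list(p) for p in marginals]  # mutable working copies
    joint = {}
    while True:
        # Position of the largest remaining mass in each marginal.
        cell = tuple(max(range(len(p)), key=p.__getitem__) for p in residual)
        # The most mass assignable to this joint cell without
        # overshooting any marginal is the smallest of those maxima.
        r = min(residual[i][j] for i, j in enumerate(cell))
        if r <= tol:
            break
        joint[cell] = joint.get(cell, 0.0) + r
        for i, j in enumerate(cell):
            residual[i][j] -= r
    return joint

# Example: couple p on 3 states with q on 2 states.
p, q = [0.5, 0.25, 0.25], [0.5, 0.5]
J = greedy_min_entropy_coupling([p, q])
H = -sum(v * math.log2(v) for v in J.values())  # 1.5 bits for this example
```

Each step zeroes at least one residual entry, so the loop runs at most nm times and the returned joint has at most nm nonzero cells, which sidesteps the naive exponential representation mentioned above.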
In many common-payoff games, achieving good performance requires players to develop protocols for communicating their private information implicitly -- i.e., using actions that have non-communicative effects on the environment. Multi-agent reinforcement learning practitioners typically approach this problem using independent learning methods in the hope that agents will learn implicit communication as a byproduct of expected return maximization. Unfortunately, independent learning methods are incapable of doing this in many settings. In this work, we isolate the implicit communication problem by identifying a class of partially observable common-payoff games, which we call implicit referential games, whose difficulty can be attributed to implicit communication. Next, we introduce a principled method based on minimum entropy coupling that leverages the structure of implicit referential games, yielding a new perspective on implicit communication. Lastly, we show that this method can discover performant implicit communication protocols in settings with very large spaces of messages.
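As a toy illustration of how a minimum entropy coupling could drive implicit communication (a hypothetical encoder, not the paper's implementation): if the speaker's private message is distributed as p and the environment effectively fixes the marginal over actions, sampling the action from a coupling's conditional given the message makes the action as informative as possible about the message.

```python
import random

def encode_action(msg, joint, p_msg):
    """Sample an action given a private message, using a two-variable
    coupling such as the greedy one sketched above, whose keys are
    (message, action) pairs. All names here are illustrative."""
    cond = {a: v / p_msg[msg] for (m, a), v in joint.items() if m == msg}
    actions = list(cond)
    return random.choices(actions, weights=[cond[a] for a in actions])[0]
```

A listener who knows the coupling can then invert it, e.g. by maximum a posteriori decoding of the message from the observed action.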
The degree-based entropy of a graph is defined as the Shannon entropy of the information functional that associates each vertex of the graph with its degree. In this paper, we study the extremal problem of finding the graphs attaining the minimum degree-based entropy among graphs and bipartite graphs with a given number of vertices and edges. We characterize the unique extremal graph achieving the minimum value among graphs with a given number of vertices and edges. We also present a lower bound for the degree-based entropy of bipartite graphs and characterize all the extremal graphs that achieve it. This implies the known result due to Cao et al. (2014) that the star attains the minimum degree-based entropy among trees with a given number of vertices.
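Under the usual definition, this weights each vertex by its degree divided by the sum of all degrees; a short sketch for concreteness:

```python
import math

def degree_based_entropy(degrees):
    """Shannon entropy of the degree-weighted distribution:
    vertex v gets probability deg(v) / (sum of all degrees)."""
    total = sum(degrees)
    return -sum((d / total) * math.log2(d / total) for d in degrees if d > 0)

# Star K_{1,4} on 5 vertices: one hub of degree 4, four leaves of degree 1.
print(degree_based_entropy([4, 1, 1, 1, 1]))  # 2.0 bits
```

For comparison, the path on the same 5 vertices has degree sequence [1, 2, 2, 2, 1] and entropy about 2.25 bits, consistent with the star being the minimizer among trees.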
Greedy algorithms are popular in compressive sensing for their high computational efficiency, but the performance of current greedy algorithms can be degraded severely by noise (both multiplicative and additive). This paper presents a robust version of the greedy cosparse algorithm known as greedy analysis pursuit (GAP). In contrast to previous methods, the proposed robust greedy analysis pursuit algorithm is based on an optimization model that allows both multiplicative and additive noise in the data-fitting constraint. In addition, a new stopping criterion is derived. The new algorithm is applied to compressive sensing of ECG signals. Numerical experiments on real-life ECG signals demonstrate the performance improvement of the proposed algorithm.
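The abstract does not spell out the optimization model, but one common way to write a data-fitting constraint that accommodates both noise types is y = (1 + e) * (Phi x) + n, with entrywise-bounded multiplicative noise e and norm-bounded additive noise n; the following feasibility check is a hypothetical sketch of that reading, not the paper's model.

```python
import numpy as np

def robust_fit_feasible(y, Phi, x, eps_mult, eps_add):
    """Hypothetical check for y = (1 + e) * (Phi @ x) + n, with
    |e_i| <= eps_mult (multiplicative) and ||n||_2 <= eps_add (additive).
    Parameter names and the exact model are illustrative assumptions."""
    z = Phi @ x
    slack = eps_mult * np.abs(z)                       # multiplicative allowance
    residual = np.maximum(np.abs(y - z) - slack, 0.0)  # left for additive noise
    return np.linalg.norm(residual) <= eps_add
```

A robust recovery algorithm would then search for the cosparsest x satisfying such a constraint rather than an exact fit.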
We point out a limitation of mutual information neural estimation (MINE) in which the network fails to learn during the initial training phase, leading to slow convergence in the number of training iterations. To solve this problem, we propose a faster method called mutual information neural entropic estimation (MI-NEE). Our solution first generalizes MINE to estimate the entropy using a custom reference distribution. The entropy estimate can then be used to estimate the mutual information. We argue that the seemingly redundant intermediate step of entropy estimation allows one to improve convergence via an appropriate choice of reference distribution. In particular, we show that MI-NEE reduces to MINE in the special case where the reference distribution is the product of marginal distributions, but faster convergence is possible by choosing the uniform distribution as the reference instead. Compared to the product of marginals, the uniform distribution introduces more samples in low-density regions and fewer samples in high-density regions, which appears to lead to an overall larger gradient and faster convergence.
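A minimal PyTorch sketch of the Donsker-Varadhan bound underlying both estimators (network size and names are illustrative): D(P||Q) >= E_P[T] - log E_Q[e^T]. MINE applies this with Q equal to the product of marginals; MI-NEE instead uses it to estimate entropies against a reference such as the uniform distribution, e.g. h(X) = log(volume of the reference box) - D(P_X||Uniform), and combines them as I(X;Y) = h(X) + h(Y) - h(X,Y).

```python
import math
import torch
import torch.nn as nn

class DVCritic(nn.Module):
    """Critic T for the Donsker-Varadhan lower bound
    D(P || Q) >= E_P[T] - log E_Q[exp(T)]."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1))

    def dv_bound(self, p_samples, q_samples):
        t_p = self.net(p_samples).squeeze(-1).mean()
        t_q = self.net(q_samples).squeeze(-1)
        # log of the empirical mean of exp(T) under the reference Q
        log_mean_exp = torch.logsumexp(t_q, dim=0) - math.log(t_q.numel())
        return t_p - log_mean_exp
```

Training maximizes dv_bound; with a uniform reference, q_samples are drawn uniformly from a box containing the data, which places more critic gradient in low-density regions, as argued above.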
This paper addresses compressive sensing for multi-channel ECG. Compared to the traditional sparse signal recovery approach, which decomposes the signal into the product of a dictionary and a sparse vector, the recently developed cosparse approach exploits sparsity of the product of an analysis matrix and the original signal. We apply the cosparse Greedy Analysis Pursuit (GAP) algorithm to compressive sensing of ECG signals. Moreover, to reduce processing time, the classical single-channel GAP is generalized to a multi-channel GAP algorithm, which simultaneously reconstructs multiple signals with similar support. Numerical experiments show that the proposed method outperforms classical sparse multi-channel greedy algorithms in terms of accuracy and the single-channel cosparse approach in terms of processing speed.
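A rough sketch of a GAP-style iteration (illustrative, not the paper's code): maintain a cosupport estimate, i.e. the analysis rows believed to annihilate the signal, solve a penalized least-squares fit, and prune the row that most violates that belief. The multi-channel variant below scores rows by aggregating the analysis energy across channels so that all channels share one cosupport.

```python
import numpy as np

def multichannel_gap_sketch(Y, Phi, Omega, n_prune, lam=1e3):
    """Illustrative multi-channel GAP-style recovery.

    Y: (m, c) measurements for c channels; Phi: (m, n) sensing matrix;
    Omega: (p, n) analysis operator. Prunes n_prune rows from the
    shared cosupport estimate; lam weights the cosupport penalty.
    """
    p, n = Omega.shape
    cosupport = np.arange(p)  # start by assuming every row annihilates X
    X = None
    for _ in range(n_prune):
        # Penalized least squares: fit Y while pushing Omega[cosupport] @ X to 0.
        A = np.vstack([Phi, np.sqrt(lam) * Omega[cosupport]])
        B = np.vstack([Y, np.zeros((len(cosupport), Y.shape[1]))])
        X, *_ = np.linalg.lstsq(A, B, rcond=None)
        # Shared score: energy of each analysis row summed over channels.
        scores = np.sum((Omega[cosupport] @ X) ** 2, axis=1)
        cosupport = np.delete(cosupport, np.argmax(scores))
    return X
```

Sharing one cosupport across channels is what lets the multi-channel version amortize work over signals with similar support.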