We study the problem of identifying the causal relationship between two discrete random variables from observational data. We recently proposed a novel framework called entropic causality that works in a very general functional model but makes the assumption that the unobserved exogenous variable has small entropy in the true causal direction. This framework requires the solution of a minimum entropy coupling problem: given marginal distributions of m discrete random variables, each on n states, find the joint distribution with minimum entropy that respects the given marginals. This corresponds to minimizing a concave function of n^m variables over a convex polytope defined by nm linear constraints, called a transportation polytope. Unfortunately, it was recently shown that this minimum entropy coupling problem is NP-hard, even for 2 variables with n states. Even representing points (joint distributions) over this space can require exponential complexity (in n and m) if done naively. In our recent work we introduced an efficient greedy algorithm to find an approximate solution to this problem. In this paper we analyze this algorithm and establish two results: that it always finds a local minimum, and that its solution is within an additive approximation error of the unknown global optimum.
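To make the greedy idea concrete, the following is a minimal sketch of a greedy coupling heuristic of this flavor for two marginals: at each step it matches the largest remaining probability mass of each marginal, assigns their minimum to a new joint cell, and subtracts it from both sides. The function names, the two-marginal setting, and the tolerance parameter are assumptions made for the example, not the authors' exact algorithm.

```python
import heapq
import math

def greedy_coupling(p, q, tol=1e-12):
    """Greedy heuristic for a low-entropy coupling of marginals p and q.

    Repeatedly pops the largest remaining mass from each marginal, assigns
    their minimum to one joint cell, and pushes back any leftover mass.
    Returns a dict mapping (i, j) -> joint probability.
    """
    # Max-heaps via negated probabilities, keyed by state index.
    hp = [(-pi, i) for i, pi in enumerate(p) if pi > tol]
    hq = [(-qj, j) for j, qj in enumerate(q) if qj > tol]
    heapq.heapify(hp)
    heapq.heapify(hq)

    joint = {}
    while hp and hq:
        pi, i = heapq.heappop(hp)
        qj, j = heapq.heappop(hq)
        pi, qj = -pi, -qj
        mass = min(pi, qj)
        joint[(i, j)] = joint.get((i, j), 0.0) + mass
        # Push back whatever mass remains on either side.
        if pi - mass > tol:
            heapq.heappush(hp, (-(pi - mass), i))
        if qj - mass > tol:
            heapq.heappush(hq, (-(qj - mass), j))
    return joint

def entropy(probs):
    """Shannon entropy (in bits) of the nonzero probabilities given."""
    return -sum(v * math.log2(v) for v in probs if v > 0)

if __name__ == "__main__":
    p = [0.5, 0.3, 0.2]
    q = [0.6, 0.25, 0.15]
    joint = greedy_coupling(p, q)
    print("joint entropy:", entropy(joint.values()))
    print("marginal entropies:", entropy(p), entropy(q))
```

Because every step creates at most one new joint cell, the sketch produces a coupling supported on few outcomes, which keeps the joint entropy small; it respects the marginals by construction, but like any greedy heuristic it is not guaranteed to reach the global optimum.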
In many common-payoff games, achieving good performance requires players to develop protocols for communicating their private information implicitly -- i.e., using actions that have non-communicative effects on the environment. Multi-agent reinforcem
The degree-based entropy of a graph is defined as the Shannon entropy based on the information functional that associates the vertices of the graph with the corresponding degrees. In this paper, we study extremal problems of finding the graphs attain
Greedy algorithms are popular in compressive sensing for their high computational efficiency. However, the performance of current greedy algorithms can be seriously degraded by noise (both multiplicative and additive). A robust version of g
We point out a limitation of mutual information neural estimation (MINE), where the network fails to learn during the initial training phase, leading to slow convergence in the number of training iterations. To solve this problem, we propose a faster
This paper addresses compressive sensing for multi-channel ECG. Compared to the traditional sparse signal recovery approach which decomposes the signal into the product of a dictionary and a sparse vector, the recently developed cosparse approach exp