The sparsity and compressibility of finite-dimensional signals are of great interest in fields such as compressed sensing. The notion of compressibility has also been extended to infinite sequences of i.i.d. or ergodic random variables based on the observed error in their nonlinear $k$-term approximation. In this work, we use an entropy measure to study the compressibility of continuous-domain innovation processes (alternatively known as white noise). Specifically, we define such a measure as the entropy limit of the doubly quantized (in time and amplitude) process. This provides a tool to compare the compressibility of various innovation processes. It also allows us to identify an analogue of the concept of entropy dimension, originally defined by Rényi for random variables. Particular attention is given to stable and impulsive Poisson innovation processes. Our results identify Poisson innovations as the more compressible of the two, with an entropy measure far below that of stable innovations. While this finding departs from previous results on the compressibility of fat-tailed distributions, our entropy measure nevertheless ranks stable innovations according to their tail decay.
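For reference, Rényi's entropy (information) dimension of a scalar random variable $X$, which the continuous-domain measure above parallels, is obtained from the entropy of the uniformly quantized variable; the display below recalls the standard definition (textbook material, not the paper's notation):

$$ d(X) \;=\; \lim_{m\to\infty} \frac{H\big(\lfloor mX \rfloor / m\big)}{\log m}, \qquad H\big(\lfloor mX \rfloor / m\big) \;=\; d(X)\,\log m + h + o(1), $$

where the second expansion holds whenever the $d$-dimensional entropy $h$ exists; $d(X) = 1$ for absolutely continuous $X$ and $d(X) = 0$ for discrete $X$ with finite entropy.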
In this paper, we propose a reconfigurable intelligent surface (RIS) enhanced spectrum sensing system, in which the primary transmitter is equipped with a single antenna, the secondary transmitter is equipped with multiple antennas, and the RIS is emp
Due to hardware limitations, the phase shifts of the reflecting elements of reconfigurable intelligent surfaces (RISs) need to be quantized into discrete values. This letter aims to unveil the minimum required number of phase quantization levels $L$
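As a toy numerical illustration of the effect of such quantization (not the analysis of the letter; the element count, random phases, and uniform quantizer below are illustrative assumptions), one can compare the array gain obtained with $L$-level phases against continuous phases:

```python
import numpy as np

def quantize_phases(phi, L):
    """Map continuous phases in [0, 2*pi) to the nearest of L uniformly spaced levels."""
    step = 2 * np.pi / L
    return (np.round(phi / step) * step) % (2 * np.pi)

# Illustrative check: relative array gain of an N-element RIS when the ideal
# (continuous) phases are rounded to L levels; N and the phase draws are arbitrary.
rng = np.random.default_rng(0)
N = 64
ideal = rng.uniform(0, 2 * np.pi, N)                # phases that would align all reflected paths
for L in (1, 2, 4, 8):
    err = quantize_phases(ideal, L) - ideal
    gain = np.abs(np.exp(1j * err).sum() / N) ** 2  # 1.0 means no loss vs. continuous phases
    print(f"L = {L}: relative array gain {gain:.3f}")
```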
We develop a method for the accurate reconstruction of non-bandlimited finite-rate-of-innovation signals on the sphere. For signals consisting of a finite number of Dirac functions on the sphere, we develop an annihilating-filter-based method for the
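For context, the sketch below shows the standard one-dimensional annihilating-filter (Prony) step for a stream of $K$ Diracs on $[0,1)$ recovered from $2K+1$ Fourier coefficients; it is a simplified Euclidean analogue, not the spherical construction developed in the paper, and all names are illustrative:

```python
import numpy as np

def fri_dirac_locations(x_hat, K):
    """Recover K Dirac locations t_k in [0, 1) from Fourier coefficients
    x_hat[m] = sum_k a_k * exp(-2j*pi*m*t_k), m = 0, ..., 2K (1-D annihilating filter)."""
    # Annihilation equations: sum_l h[l] * x_hat[m - l] = 0 for m = K, ..., 2K
    A = np.array([[x_hat[m - l] for l in range(K + 1)] for m in range(K, 2 * K + 1)])
    _, _, Vh = np.linalg.svd(A)
    h = Vh[-1].conj()                                # annihilating filter = null-space vector
    u = np.roots(h)                                  # filter roots u_k = exp(-2j*pi*t_k)
    return np.sort((-np.angle(u) / (2 * np.pi)) % 1.0)

# Toy example with K = 2 Diracs at t = 0.2 and 0.55
t_true, a = np.array([0.2, 0.55]), np.array([1.0, 0.7])
m = np.arange(2 * 2 + 1)
x_hat = (a * np.exp(-2j * np.pi * np.outer(m, t_true))).sum(axis=1)
print(fri_dirac_locations(x_hat, 2))                 # approximately [0.2, 0.55]
```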
Taylor's law quantifies the scaling properties of the fluctuations of the number of innovations occurring in open systems. Urn-based modelling schemes have already proven to be effective in modelling this complex behaviour. Here, we present analyt
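In standard notation (not necessarily the paper's), Taylor's law states that the variance of the number of innovations observed up to time $t$ scales as a power of its mean,

$$ \operatorname{Var}\big(N(t)\big) \;\propto\; \mathbb{E}\big[N(t)\big]^{\beta}, $$

with the Taylor exponent $\beta$ typically lying between 1 (Poisson-like fluctuations) and 2.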
Denoising a stationary process $(X_i)_{i \in \mathbb{Z}}$ corrupted by additive white Gaussian noise is a classic and fundamental problem in information theory and statistical signal processing. Despite considerable progress in designing efficient denoising algo
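To make the setting concrete, the sketch below denoises a toy stationary Gaussian AR(1) sequence observed in additive white Gaussian noise with a frequency-domain Wiener filter; the AR(1) model, its parameters, and the Wiener filter itself are illustrative assumptions, not the algorithms studied in the paper:

```python
import numpy as np

# Toy setup: stationary AR(1) process X_i = rho*X_{i-1} + W_i observed in AWGN of std sigma.
rng = np.random.default_rng(1)
n, rho, sigma = 4096, 0.9, 0.5
x = np.zeros(n)
for i in range(1, n):
    x[i] = rho * x[i - 1] + rng.standard_normal()
y = x + sigma * rng.standard_normal(n)               # noisy observations Y_i = X_i + N_i

# Frequency-domain Wiener denoiser (circular approximation of the stationary filter).
f = np.fft.rfftfreq(n)
S_x = 1.0 / np.abs(1 - rho * np.exp(-2j * np.pi * f)) ** 2   # AR(1) power spectrum
H = S_x / (S_x + sigma ** 2)                         # Wiener gain S_x / (S_x + S_noise)
x_hat = np.fft.irfft(H * np.fft.rfft(y), n)

print("noisy MSE:   ", np.mean((y - x) ** 2))
print("denoised MSE:", np.mean((x_hat - x) ** 2))
```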