Transforms based on random matrices have found many applications. We are concerned with projecting a signal onto Gaussian-distributed random orthogonal bases, in a way that is easily inverted through transposes so as to facilitate iterative reconstruction. We derive an efficient method for implementing large random unitary matrices through a set of Givens rotations. Random angles are generated hierarchically on the fly, and the inverse merely requires traversing the angles in reverse order; hierarchical randomization of the angles also reduces storage. Using these random unitary matrices as building blocks, we introduce random paraunitary systems (filter banks), and we highlight efficient implementations of the paraunitary system and of its inverse. We further derive an adaptive under-decimated system in which one can control and adapt the number of projections the signal undergoes, in effect varying the sampling compression ratio along the signal without segmenting it: locally, the system may range from highly compressive sampling matrices to (para)unitary random ones. One motivation is to adapt to the local sparseness characteristics of non-stationary signals.
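The core mechanism described above can be sketched in a few lines: a random orthogonal transform built as a product of Givens rotations whose angles are regenerated from a seed (rather than stored), and whose inverse simply traverses the rotations in reverse order with negated angles. This is a minimal illustration under assumed conventions, not the paper's hierarchical angle-generation scheme; the function names (`apply_givens`, `random_orthogonal_transform`, `inverse_transform`) are ours.

```python
import numpy as np

def apply_givens(x, i, j, theta):
    """Rotate coordinates i and j of vector x by angle theta, in place."""
    c, s = np.cos(theta), np.sin(theta)
    xi, xj = x[i], x[j]
    x[i] = c * xi - s * xj
    x[j] = s * xi + c * xj

def random_orthogonal_transform(x, seed=0):
    """Apply a random orthogonal transform composed of Givens rotations.

    The rotation plan (index pairs and random angles) is derived from the
    seed, so it can be regenerated on the fly instead of being stored.
    """
    rng = np.random.default_rng(seed)
    y = np.asarray(x, dtype=float).copy()
    n = len(y)
    plan = [(i, j, rng.uniform(0.0, 2.0 * np.pi))
            for i in range(n) for j in range(i + 1, n)]
    for i, j, theta in plan:
        apply_givens(y, i, j, theta)
    return y, plan

def inverse_transform(y, plan):
    """Invert by traversing the rotations in reverse with negated angles."""
    x = np.asarray(y, dtype=float).copy()
    for i, j, theta in reversed(plan):
        apply_givens(x, i, j, -theta)
    return x
```

Because every Givens rotation is orthogonal, the composed transform preserves the Euclidean norm, and its inverse costs no more than the forward pass.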
We introduce a novel random projection technique for efficiently reducing the dimension of very high-dimensional tensors. Building upon classical results on Gaussian random projections and Johnson-Lindenstrauss transforms (JLT), we propose two tensor
Let $X$ be a $d$-dimensional random vector and $X_\theta$ its projection onto the span of a set of orthonormal vectors $\{\theta_1, \ldots, \theta_k\}$. Conditions on the distribution of $X$ are given such that if $\theta$ is chosen according to Haar measure on
Whilst adversarial attack detection has received considerable attention, it remains a fundamentally challenging problem from two perspectives. First, while threat models can be well-defined, attacker strategies may still vary widely within those cons
We propose two training techniques for improving the robustness of Neural Networks to adversarial attacks, i.e. manipulations of the inputs that are maliciously crafted to fool networks into incorrect predictions. Both methods are independent of the
Random projection is often used to project higher-dimensional vectors onto a lower-dimensional space, while approximately preserving their pairwise distances. It has emerged as a powerful tool in various data processing tasks and has attracted consid
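The distance-preserving property that these abstracts invoke can be demonstrated concretely with a plain Gaussian random projection: multiplying by a $d \times k$ matrix of i.i.d. standard normals scaled by $1/\sqrt{k}$ approximately preserves pairwise Euclidean distances, in the spirit of the Johnson-Lindenstrauss lemma. The sketch below is illustrative only (the function name `random_projection` and its parameters are ours, not from any of the works listed).

```python
import numpy as np

def random_projection(X, k, seed=0):
    """Project the rows of X from d dimensions down to k dimensions.

    Uses a d x k matrix of i.i.d. standard Gaussians scaled by 1/sqrt(k),
    so squared pairwise distances are preserved in expectation.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    R = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ R
```

For two vectors in a few thousand dimensions projected to a few hundred, the ratio of projected to original distance concentrates tightly around 1, which is what makes such projections useful as a preprocessing step.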