The Variational Autoencoder (VAE) is a scalable method for learning latent variable models of complex data. It employs a clear objective that can be easily optimized. However, it does not explicitly measure the quality of the learned representations. We propose a Variational Mutual Information Maximization Framework for VAE to address this issue. It provides an objective that maximizes the mutual information between latent codes and observations. The objective acts as a regularizer that prevents the VAE from ignoring the latent code and allows one to select particular components of the code to be most informative with respect to the observations. In addition, the proposed framework provides a way to evaluate the mutual information between latent codes and observations for a fixed VAE model.
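To make the mechanism concrete, here is a minimal sketch of how such a mutual information regularizer can be attached to a VAE objective. This is not the paper's implementation: it uses the standard variational (Barber-Agakov) lower bound on I(z; x), obtained by decoding prior samples and re-encoding them with the approximate posterior q(z|x). The names `encoder`, `decoder`, `lambda_mi`, `batch_size`, and `latent_dim` are placeholders, and a Gaussian encoder returning (mu, logvar) and a decoder producing Bernoulli logits are assumed.

```python
import torch

def mi_lower_bound(decoder, encoder, batch_size, latent_dim):
    """Variational lower bound on I(z; x), up to the constant entropy
    H(z) of the fixed prior p(z) = N(0, I)."""
    # Sample latent codes from the prior.
    z = torch.randn(batch_size, latent_dim)
    # Decode; the Bernoulli mean is used as a differentiable surrogate
    # for sampling x ~ p(x|z).
    x = torch.sigmoid(decoder(z))
    # Re-encode: diagonal-Gaussian posterior q(z|x) with parameters (mu, logvar).
    mu, logvar = encoder(x)
    # log q(z|x) for a diagonal Gaussian, summed over latent dimensions.
    log2pi = torch.log(torch.tensor(2.0 * torch.pi))
    log_q = -0.5 * (log2pi + logvar + (z - mu) ** 2 / logvar.exp()).sum(dim=1)
    return log_q.mean()

# Hypothetical training step: subtract the weighted bound from the usual
# negative ELBO, so that maximizing I(z; x) acts as a regularizer.
# loss = neg_elbo - lambda_mi * mi_lower_bound(decoder, encoder, B, D)
```

Under these assumptions, selecting particular components of the code to be most informative, as the abstract describes, would amount to restricting the sum over latent dimensions to the chosen subset.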
Learning interpretable and disentangled representations of data is a key topic in machine learning research. The Variational Autoencoder (VAE) is a scalable method for learning directed latent variable models of complex data. It employs a clear and interpretable objective that can be easily optimized. …
Variational autoencoders (VAEs) are a powerful class of likelihood-based generative models with applications in various domains. However, they struggle to generate high-quality images, especially when samples are obtained from the prior without any …
Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. In practice, however, efficient inference by optimisation …
This paper describes a novel diffusion model, DyDiff-VAE, for information diffusion prediction on social media. Given the initial content and a sequence of forwarding users, DyDiff-VAE aims to estimate the propagation likelihood for other potential users. …
Although substantial efforts have been made to learn disentangled representations under the variational autoencoder (VAE) framework, the fundamental properties of the learning dynamics of most VAE models remain unknown and under-investigated. …