Deep generative models enhanced by the Wasserstein distance have achieved remarkable success in recent years. Wasserstein Auto-Encoders (WAEs) are auto-encoder based generative models that aim to minimize the Wasserstein distance between the data distribution and the generated distribution. The quality of samples generated by a WAE depends on how close the generated distribution is to the data distribution. In practice, however, the WAE minimizes a Wasserstein distance between the data distribution and the reconstructed distribution in data space plus a penalty divergence between the aggregated posterior and the prior in latent space, leading to a gap between theory and practice. Consequently, the quality of samples generated by the WAE is often unsatisfactory. In this paper, we propose a novel dual rejection sampling method to improve the quality of WAE samples at sampling time. The proposed method first corrects the generative prior with a discriminator-based rejection sampling scheme in latent space and then rectifies the generated distribution with a second discriminator-based rejection sampling step in data space. Our method is validated, both qualitatively and quantitatively, by extensive experiments on three real-world datasets.
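To make the sampling scheme concrete, below is a minimal sketch (not the authors' code) of a single stage of discriminator-based rejection sampling; the dual method applies such a stage twice, first to prior samples in latent space and then to decoded samples in data space. The discriminator logit is modeled here as a callable returning an estimated log density ratio, and the bound M is pre-estimated from a burn-in batch, both of which are assumptions of this sketch. In the toy usage the exact log ratio between two Gaussians stands in for an ideal discriminator.

import numpy as np

def rejection_sample(propose, logit, n_samples, n_burnin=1000, rng=None):
    """Accept proposals with probability proportional to exp(logit(x)).

    propose(k) -> array of k proposal samples
    logit(x)   -> per-sample estimate of log p_target(x) - log p_proposal(x)
    """
    rng = rng or np.random.default_rng(0)
    # Estimate log M from a burn-in batch (a common heuristic in
    # discriminator rejection sampling); M should upper-bound the ratio.
    log_m = logit(propose(n_burnin)).max()

    accepted = []
    while len(accepted) < n_samples:
        x = propose(n_samples)
        # Acceptance probability exp(log ratio - log M), clipped in case
        # the burn-in batch underestimated M.
        p_accept = np.clip(np.exp(logit(x) - log_m), 0.0, 1.0)
        keep = rng.uniform(size=len(x)) < p_accept
        accepted.extend(x[keep])
    return np.array(accepted[:n_samples])

if __name__ == "__main__":
    # Toy example: proposal N(0, 2^2), target N(1, 1); the exact log
    # density ratio plays the role of an ideal discriminator logit.
    rng = np.random.default_rng(1)
    propose = lambda k: rng.normal(0.0, 2.0, size=k)
    log_ratio = lambda x: -0.5 * (x - 1.0) ** 2 + 0.5 * (x / 2.0) ** 2 + np.log(2.0)
    samples = rejection_sample(propose, log_ratio, 5000, rng=rng)
    print(samples.mean(), samples.std())  # should be close to (1.0, 1.0)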