Abstract

In this study, we propose a novel generative model that combines the strengths of two well-known generative model families: normalizing flows and diffusion models. Normalizing flows often fail to map complex data fully onto a Gaussian base distribution, which limits their expressiveness. To overcome this, our model uses a normalizing flow to map the complex data distribution to a latent distribution, and then employs a diffusion model to bring that latent distribution into agreement with a Gaussian. We also introduce a training procedure that combines the maximum likelihood objective of normalizing flows with the variational lower bound of diffusion models, yielding a unified end-to-end architecture. We evaluate our model using Fréchet Inception Distance and negative log-likelihood scores, and show that it outperforms Neural Spline Flows and gives results comparable to standard diffusion models. Our work presents a promising direction in generative modeling, specifically in image synthesis.
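The combined objective described above can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: it assumes a hypothetical one-dimensional affine flow and a single-step, fixed "denoiser" standing in for the learned diffusion model, just to show how a change-of-variables likelihood term and a diffusion-style noise-prediction term add into one loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D affine "flow": z = (x - mu) / sigma.
# A real normalizing flow would be a learned invertible network.
mu, sigma = 2.0, 1.5

def flow_forward(x):
    """Map data to the latent space; return z and log|det dz/dx|."""
    z = (x - mu) / sigma
    log_det = -np.log(sigma) * np.ones_like(x)
    return z, log_det

def flow_nll(x):
    """Negative log-likelihood via the change-of-variables formula,
    assuming a standard Gaussian base distribution."""
    z, log_det = flow_forward(x)
    log_pz = -0.5 * (z ** 2 + np.log(2.0 * np.pi))
    return -(log_pz + log_det)

def diffusion_loss(z, t=0.5):
    """One-step denoising-style surrogate for the variational bound:
    noise the latent, then score how well a (placeholder) model
    predicts the added noise. A real model would be learned."""
    eps = rng.standard_normal(z.shape)
    z_t = np.sqrt(1.0 - t) * z + np.sqrt(t) * eps
    eps_hat = z_t  # placeholder denoiser for illustration only
    return np.mean((eps - eps_hat) ** 2)

# Toy data drawn from the distribution the affine flow can represent.
x = rng.normal(mu, sigma, size=1000)
z, _ = flow_forward(x)

# Unified objective: flow likelihood term + diffusion term on the latent.
total_loss = flow_nll(x).mean() + diffusion_loss(z)
print(total_loss)
```

In the actual model, both terms would be backpropagated through jointly, so the flow learns a latent representation that the diffusion stage can finish mapping to a Gaussian.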
