Publications

International Conference

Maximum Likelihood Training of Implicit Nonlinear Diffusion Model
Category
Machine Learning
Author
Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, and Il-Chul Moon
Year
2022
Conference Name
Neural Information Processing Systems (NeurIPS 2022)
Presentation Date
Nov 28-Dec 9
City
New Orleans
Country
USA
File
[NeurIPS22, camera-ready] Maximum Likelihood Training of Implicit Nonlinear Diffusion Models.pdf (12.0M)

Dongjun Kim, Byeonghu Na, Se Jung Kwon, Dongsoo Lee, Wanmo Kang, and Il-Chul Moon, Maximum Likelihood Training of Implicit Nonlinear Diffusion Model, Neural Information Processing Systems (NeurIPS 2022), New Orleans, USA, Nov 28-Dec 9, 2022 


Abstract

Although diverse variants of diffusion models exist, only a few works have investigated extending the linear diffusion into a nonlinear diffusion process. The effect of nonlinearity is still poorly understood, but intuitively there should be diffusion patterns that train the generative distribution toward the data distribution more efficiently. This paper introduces a data-adaptive nonlinear diffusion process for score-based diffusion models. The proposed Implicit Nonlinear Diffusion Model (INDM) learns by combining a normalizing flow and a diffusion process. Specifically, INDM implicitly constructs a nonlinear diffusion on the data space by leveraging a linear diffusion on the latent space through a flow network. This flow network is key to forming the nonlinear diffusion, as the nonlinearity depends on the flow network. This flexible nonlinearity improves the learning curve of INDM to nearly Maximum Likelihood Estimation (MLE), in contrast to the non-MLE curve of DDPM++, which turns out to be an inflexible version of INDM with the flow fixed as an identity mapping. In addition, the discretization of INDM is robust in sampling. In experiments, INDM achieves a state-of-the-art FID of 1.75 on CelebA. We release our code at https://github.com/byeonghu-na/INDM.
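The core construction described in the abstract can be illustrated with a toy sketch. This is not the paper's implementation: a one-dimensional element-wise invertible map stands in for the actual normalizing-flow network, and a VP-style schedule with a constant beta is an assumed simplification. It only demonstrates the mechanism: a linear diffusion applied in the latent space, pulled back through the inverse flow, induces a nonlinear diffusion trajectory on the data space.

```python
import numpy as np

def flow(x):
    # Toy invertible "flow network" f: data -> latent (stands in for the real flow)
    return np.sign(x) * np.log1p(np.abs(x))

def flow_inv(z):
    # Exact inverse f^{-1}: latent -> data
    return np.sign(z) * np.expm1(np.abs(z))

rng = np.random.default_rng(0)
x0 = rng.normal(2.0, 0.5, size=1000)   # toy data samples
z0 = flow(x0)                          # map data into the latent space

t = 0.5                                # diffusion time in [0, 1]
alpha = np.exp(-0.5 * t)               # VP-SDE mean coefficient (assumed beta = 1)
sigma = np.sqrt(1.0 - alpha**2)        # matching noise scale

# Linear (Gaussian) diffusion in the latent space ...
z_t = alpha * z0 + sigma * rng.standard_normal(z0.shape)
# ... induces a nonlinear diffusion on the data space via the inverse flow.
x_t = flow_inv(z_t)
```

Because the perturbation kernel is Gaussian only in latent coordinates, the marginal of `x_t` on the data space is non-Gaussian whenever the flow is non-affine, which is exactly the data-adaptive nonlinearity the paper attributes to the flow network.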


@article{kim2022maximum, 

title={Maximum Likelihood Training of Implicit Nonlinear Diffusion Model}, 

author={Kim, Dongjun and Na, Byeonghu and Kwon, Se Jung and Lee, Dongsoo and Kang, Wanmo and Moon, Il-Chul}, 

journal={Advances in Neural Information Processing Systems}, 

volume={35}, 

pages={32270--32284}, 

year={2022} 

} 


Source Website:

https://openreview.net/pdf?id=TQn44YPuOR2