Codes for "Understanding and Accelerating Particle-Based Variational Inference" (ICML-19)


Chang Liu <[email protected]; [email protected]>, Jingwei Zhuo, Pengyu Cheng, Ruiyi Zhang, Jun Zhu, and Lawrence Carin. ICML 2019.

[Paper & Appendix] [Slides] [Poster]

Introduction

The project aims at understanding the mechanism of particle-based variational inference methods (ParVIs; e.g., Stein Variational Gradient Descent (SVGD; Liu & Wang, 2016)), and at improving the methods based on this understanding. We find that all existing ParVIs, especially SVGD, approximate the gradient flow of the KL divergence on the Wasserstein space, which drives the particle distribution towards the posterior. The approximation made by various ParVIs is essentially a smoothing operation on the particle distribution, in one of two equivalent forms: smoothing the density or smoothing functions. This smoothing treatment is necessary, and it bounds the approximation flexibility of ParVIs. We develop two new ParVIs based on this finding. Inspired by the gradient flow interpretation, we accelerate ParVIs by adapting Nesterov's acceleration methods on Riemannian manifolds (e.g., Liu et al. (2017) and Zhang & Sra (2018)) and leveraging the geometry of the Wasserstein space. The acceleration framework applies to all ParVIs. We also develop a principled method for selecting the bandwidth of the smoothing kernel that ParVIs use.

This repository implements the proposed acceleration framework, the two new ParVIs, and the bandwidth selection method. Other ParVIs (SVGD, Blob) are also implemented. All methods are implemented in Python with TensorFlow.
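As a point of reference, below is a minimal NumPy sketch of the vanilla SVGD update of Liu & Wang (2016) together with a plain Nesterov-style momentum loop. This is an illustration, not the repository's TensorFlow implementation: the names rbf_kernel, svgd_step, and accelerated_svgd are hypothetical, the bandwidth uses the common median heuristic rather than the paper's selection method, and the momentum loop is a Euclidean stand-in for the paper's Wasserstein/Riemannian acceleration schemes.

    import numpy as np

    def rbf_kernel(x, h):
        # Pairwise RBF kernel K[j, i] = exp(-||x_j - x_i||^2 / h) and its
        # gradient with respect to the first argument x_j.
        diffs = x[:, None, :] - x[None, :, :]            # (n, n, d)
        K = np.exp(-np.sum(diffs ** 2, axis=-1) / h)     # (n, n)
        grad_K = -(2.0 / h) * diffs * K[:, :, None]      # (n, n, d)
        return K, grad_K

    def svgd_step(x, grad_log_p, eps=1e-2):
        # One vanilla SVGD update; grad_log_p maps (n, d) -> (n, d).
        n = x.shape[0]
        sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
        # Median-heuristic bandwidth (a common default, not the paper's method).
        h = max(np.median(sq_dists), 1e-8) / max(np.log(n), 1.0)
        K, grad_K = rbf_kernel(x, h)
        # phi(x_i) = (1/n) sum_j [k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i)]
        phi = (K.T @ grad_log_p(x) + grad_K.sum(axis=0)) / n
        return x + eps * phi

    def accelerated_svgd(x, grad_log_p, eps=1e-2, n_iters=500):
        # Nesterov-style momentum over SVGD steps. Euclidean sketch only:
        # the paper's accelerated schemes operate on the Wasserstein geometry.
        u = x.copy()
        for t in range(1, n_iters + 1):
            x_new = svgd_step(u, grad_log_p, eps)             # step at lookahead point
            u = x_new + (t - 1.0) / (t + 2.0) * (x_new - x)   # momentum extrapolation
            x = x_new
        return x

    # Usage: drive particles toward a 2-D standard Gaussian, whose score is -x.
    x0 = 3.0 * np.random.randn(100, 2)
    x_final = accelerated_svgd(x0, grad_log_p=lambda x: -x)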

Instructions

  • For the synthetic experiment:

    Open "synthetic_run.ipynb" in a Jupyter notebook.

  • For the Bayesian logistic regression experiment:

    Open "blr_run.ipynb" in a jupyter notebook to run trials and view results. Codes are developed based on the codes of Liu & Wang (2016).

  • For the Bayesian neural network experiment:

    Edit the settings file "bnn_set_kin8nm.py" to specify the experiment settings, and then run the command

     python bnn_run.py bnn_set_kin8nm.py

    to run the experiment under the specified settings. The code is based on that of Liu & Wang (2016).

  • For the Latent Dirichlet Allocation experiment:

    First run

     python lda_build.py build_ext --inplace

    to compile the Cython code, then run

     python lda_run.py [a settings file beginning with 'lda_set_icml_']

    to run the experiment under the specified settings (see the combined usage sketch after this list).

    The ICML dataset (download here) was developed and used by Ding et al. (2015).

    The code is based on that of Patterson & Teh (2013) for their work "Stochastic Gradient Riemannian Langevin Dynamics for Latent Dirichlet Allocation".
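
For concreteness, a full LDA run chains the two commands above. The settings filename below is hypothetical; replace it with one of the repository's actual files beginning with 'lda_set_icml_':

     # Compile the Cython extension in place, then run one experiment.
     python lda_build.py build_ext --inplace
     python lda_run.py lda_set_icml_example.py   # hypothetical settings filename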

Citation

@InProceedings{liu2019understanding_a,
  title        = {Understanding and Accelerating Particle-Based Variational Inference},
  author       = {Liu, Chang and Zhuo, Jingwei and Cheng, Pengyu and Zhang, Ruiyi and Zhu, Jun and Carin, Lawrence},
  booktitle    = {Proceedings of the 36th International Conference on Machine Learning},
  pages        = {4082--4092},
  year         = {2019},
  editor       = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume       = {97},
  series       = {Proceedings of Machine Learning Research},
  address      = {Long Beach, California, USA},
  month        = {09--15 Jun},
  publisher    = {PMLR},
  pdf          = {http://proceedings.mlr.press/v97/liu19i/liu19i.pdf},
  url          = {http://proceedings.mlr.press/v97/liu19i.html},
  organization = {IMLS},
}
