noisy K-FAC

The major contributors to this repository include Guodong Zhang and Shengyang Sun.

Introduction

This repository contains the code to reproduce the classification results from the paper Noisy Natural Gradient as Variational Inference (paper, video). For the RL code, see VIME-NNG.

Noisy Natural Gradient: variational inference can be instantiated as natural gradient with adaptive weight noise. By further approximating the full Fisher with K-FAC, we get noisy K-FAC, a surprisingly simple variational training algorithm for Bayesian Neural Nets. Noisy K-FAC not only improves classification accuracy, but also yields well-calibrated predictions.
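
For intuition, below is a minimal NumPy sketch of the noisy K-FAC update for a single fully-connected layer, following the algorithm described in the paper. All names here (sample_mn, noisy_kfac_step, lr, lam, prior_var, etc.) are illustrative assumptions and do not correspond to this repository's API.

import numpy as np

def sample_mn(M, A_inv, S_inv, scale):
    # Sample W ~ MN(M, scale * A_inv, S_inv) via Cholesky factors of the
    # row (input-side) and column (output-side) covariances.
    L_a = np.linalg.cholesky(A_inv)
    L_s = np.linalg.cholesky(S_inv)
    E = np.random.randn(*M.shape)
    return M + np.sqrt(scale) * L_a @ E @ L_s.T

def noisy_kfac_step(M, A, S, a, g, grad_W,
                    lr=1e-3, beta=0.95, lam=1.0, N=50000,
                    prior_var=1.0, extra_damping=1e-3):
    # One noisy K-FAC update for a single fully-connected layer (a sketch,
    # not this repository's implementation).
    #   M      : posterior mean of the weights, shape (d_in, d_out)
    #   A, S   : running Kronecker factors (activation / pre-activation-grad
    #            second moments)
    #   a, g   : mini-batch activations (b, d_in) and backpropagated
    #            pre-activation gradients (b, d_out)
    #   grad_W : gradient of the mini-batch log-likelihood w.r.t. the
    #            weights sampled from the current posterior
    b = a.shape[0]

    # Exponential moving averages of the Kronecker statistics.
    A = beta * A + (1 - beta) * (a.T @ a) / b
    S = beta * S + (1 - beta) * (g.T @ g) / b

    # Intrinsic damping from the Gaussian prior, plus extrinsic damping.
    gamma_in = lam / (N * prior_var)
    gamma = gamma_in + extra_damping
    A_inv = np.linalg.inv(A + np.sqrt(gamma) * np.eye(A.shape[0]))
    S_inv = np.linalg.inv(S + np.sqrt(gamma) * np.eye(S.shape[0]))

    # Natural-gradient ascent on the posterior mean; the prior appears as
    # a weight-decay term on M.
    V = grad_W - gamma_in * M
    M = M + lr * A_inv @ V @ S_inv
    return M, A, S, A_inv, S_inv

In a training loop, one would draw weights with sample_mn(M, A_inv, S_inv, lam / N), run the forward and backward pass with the sampled weights, and then call noisy_kfac_step with the resulting activations and gradients.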

Currently, the implementation of convolution with multiple samples (which is very useful for Bayesian Neural Nets) is messy and slow; we plan to implement a new TensorFlow operation after NIPS.

Citation

To cite this work, please use

@article{zhang2017noisy,
  title={Noisy Natural Gradient as Variational Inference},
  author={Zhang, Guodong and Sun, Shengyang and Duvenaud, David and Grosse, Roger},
  journal={arXiv preprint arXiv:1712.02390},
  year={2017}
}

Dependencies

This project uses Python 3.5.2. Before running the code, install the required dependencies (the implementation is built on TensorFlow).

Example

python main.py --config configs/kfac_plain.json

Tensorboard Visualization

This implementation supports TensorBoard visualization. All you have to do is launch TensorBoard and point it at your experiment directory under experiments/.

tensorboard --logdir=experiments/cifar10/noisy-kfac/summaries
