
# Gumbel Softmax VAE

PyTorch implementation of a Variational Autoencoder with Gumbel-Softmax Distribution.
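The core of the technique is the Gumbel-Softmax (Concrete) relaxation, which lets gradients flow through samples of a categorical latent variable. A minimal sketch of the reparameterized sampling step (function and variable names are illustrative, not taken from `gumbel_softmax_vae.py`):

```python
import torch
import torch.nn.functional as F

def sample_gumbel(shape, eps=1e-20):
    """Draw Gumbel(0, 1) noise via inverse transform sampling."""
    u = torch.rand(shape)
    return -torch.log(-torch.log(u + eps) + eps)

def gumbel_softmax(logits, temperature):
    """Differentiable approximate sample from a categorical distribution.

    Adds Gumbel noise to the logits and applies a temperature-scaled
    softmax; as temperature -> 0 the output approaches a one-hot sample.
    """
    y = logits + sample_gumbel(logits.size())
    return F.softmax(y / temperature, dim=-1)

# Example: relax a 10-way categorical latent variable.
logits = torch.randn(4, 10)               # batch of 4, 10 categories
soft_sample = gumbel_softmax(logits, temperature=0.5)
print(soft_sample.sum(dim=-1))            # each row sums to 1
```

Lower temperatures give outputs closer to one-hot vectors but higher-variance gradients, which is why the temperature is usually annealed during training.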

## Table of Contents

- [Installation](#installation)
- [Training](#training)
- [Results](#results)

## Installation

The program requires the following dependencies (easy to install using pip or Anaconda):

- python 3.6
- pytorch (version 0.4.0)
- numpy

## Training

```
python gumbel_softmax_vae.py --log-interval 100 --epochs 100
```
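The loss printed during training is the negative ELBO. A minimal sketch of how it is typically computed for a Gumbel-Softmax VAE on MNIST: a binary-cross-entropy reconstruction term plus the KL divergence between the relaxed posterior and a uniform categorical prior (names and shapes here are illustrative assumptions, not taken from `gumbel_softmax_vae.py`):

```python
import torch
import torch.nn.functional as F

def vae_loss(recon_x, x, qy, n_categories):
    """Negative ELBO: reconstruction BCE plus categorical KL.

    qy: posterior category probabilities, shape
    (batch, latent_dims, n_categories). The KL term compares q(y|x)
    against a uniform prior: KL = sum q * log(q * K).
    """
    bce = F.binary_cross_entropy(recon_x, x, reduction='sum')
    kl = torch.sum(qy * torch.log(qy * n_categories + 1e-20))
    return (bce + kl) / x.size(0)      # average loss per example

# Illustrative call with random tensors standing in for model outputs.
x = torch.rand(8, 784)                 # flattened MNIST batch
recon_x = torch.rand(8, 784)           # decoder output in (0, 1)
qy = F.softmax(torch.randn(8, 30, 10), dim=-1)  # 30 latent dims, 10 categories
loss = vae_loss(recon_x, x, qy, n_categories=10)
```

With 784 Bernoulli pixels per image, per-example losses in the low hundreds (as in the log below) are the expected scale.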

## Results

Training converges steadily: the average training loss drops from 197.67 in epoch 1 to 138.30 in epoch 3, the test-set loss from 171.03 to 133.91, and sample image quality improves correspondingly.

### Training output

```
Train Epoch: 1 [0/60000 (0%)]	Loss: 542.627869
Train Epoch: 1 [10000/60000 (17%)]	Loss: 210.317276
Train Epoch: 1 [20000/60000 (33%)]	Loss: 186.174133
Train Epoch: 1 [30000/60000 (50%)]	Loss: 194.145218
Train Epoch: 1 [40000/60000 (67%)]	Loss: 187.440338
Train Epoch: 1 [50000/60000 (83%)]	Loss: 186.376678
====> Epoch: 1 Average loss: 197.6736
====> Test set loss: 171.0257
Train Epoch: 2 [0/60000 (0%)]	Loss: 170.385742
Train Epoch: 2 [10000/60000 (17%)]	Loss: 162.513947
Train Epoch: 2 [20000/60000 (33%)]	Loss: 160.054916
Train Epoch: 2 [30000/60000 (50%)]	Loss: 158.194092
Train Epoch: 2 [40000/60000 (67%)]	Loss: 149.647385
Train Epoch: 2 [50000/60000 (83%)]	Loss: 144.748962
====> Epoch: 2 Average loss: 153.3126
====> Test set loss: 142.1215
Train Epoch: 3 [0/60000 (0%)]	Loss: 149.698944
Train Epoch: 3 [10000/60000 (17%)]	Loss: 140.085403
Train Epoch: 3 [20000/60000 (33%)]	Loss: 138.817505
Train Epoch: 3 [30000/60000 (50%)]	Loss: 136.967743
Train Epoch: 3 [40000/60000 (67%)]	Loss: 137.792786
Train Epoch: 3 [50000/60000 (83%)]	Loss: 134.401184
====> Epoch: 3 Average loss: 138.2995
====> Test set loss: 133.9106
```
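Gumbel-Softmax training commonly anneals the softmax temperature toward a small floor as training progresses, trading early gradient stability for sharper late-stage samples. A hedged sketch of a typical exponential schedule (the hyperparameter values are illustrative assumptions, not taken from this repository):

```python
import math

# Illustrative hyperparameters, not from gumbel_softmax_vae.py.
ANNEAL_RATE = 3e-5
TEMP_MIN = 0.5
TEMP_INIT = 1.0

def anneal_temperature(step):
    """Exponentially decay the softmax temperature, clamped at TEMP_MIN."""
    return max(TEMP_INIT * math.exp(-ANNEAL_RATE * step), TEMP_MIN)

print(anneal_temperature(0))        # 1.0 at the start of training
print(anneal_temperature(100000))   # 0.5 (clamped) late in training
```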

## MNIST

| Training Step | Ground Truth / Reconstructions | Generated Samples |
|---|---|---|
| 1 | (image) | (image) |
| 10 | (image) | (image) |
| 20 | (image) | (image) |
| 30 | (image) | (image) |