Skoltech-ML-2020-AutoEncoder-by-Forest

Comparison of the performance of the autoencoder (AE) forest algorithm with MLP & CNN AEs.

Typically, autoencoders (AEs) are associated with neural networks. Yet in the paper "AutoEncoder by Forest", the authors propose using decision tree ensembles as AEs and claim that their approach achieves reasonable performance. Here we reproduce the paper's results: we implemented the AE forest algorithm and compared its performance with MLP and CNN AEs on image datasets (MNIST, CIFAR-10, Omniglot).
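
To give a flavor of the idea (this is a simplified sketch with helper names of our own choosing, not the repo's actual implementation): encoding maps a sample to the vector of leaf indices it reaches in a fitted tree ensemble, and decoding intersects the decision rules along each leaf's root-to-leaf path (the paper's Maximal-Compatible Rule) and picks a point, e.g. the midpoint, inside the surviving interval of every feature.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def encode(forest, X):
    """Encode each sample as the vector of leaf indices it reaches."""
    return forest.apply(X)                     # shape: (n_samples, n_trees)

def _parents(t):
    """Parent pointer and branch side for every node of a fitted tree."""
    parent = np.full(t.node_count, -1)
    went_left = np.zeros(t.node_count, dtype=bool)
    for n in range(t.node_count):
        if t.children_left[n] != -1:           # internal node
            parent[t.children_left[n]] = n
            went_left[t.children_left[n]] = True
            parent[t.children_right[n]] = n
    return parent, went_left

def decode(forest, codes, feat_range=(0.0, 1.0)):
    """Intersect the leaf rules of all trees and return the midpoint
    of each feature's surviving interval."""
    n_features = forest.n_features_in_
    meta = [_parents(est.tree_) for est in forest.estimators_]
    out = np.empty((len(codes), n_features))
    for i, code in enumerate(codes):
        lo = np.full(n_features, feat_range[0])
        hi = np.full(n_features, feat_range[1])
        for est, (parent, went_left), leaf in zip(forest.estimators_, meta, code):
            t, node = est.tree_, leaf
            while parent[node] != -1:          # walk leaf -> root, tightening bounds
                p = parent[node]
                f, thr = t.feature[p], t.threshold[p]
                if went_left[node]:
                    hi[f] = min(hi[f], thr)    # left branch means x[f] <= thr
                else:
                    lo[f] = max(lo[f], thr)    # right branch means x[f] > thr
                node = p
        out[i] = (lo + hi) / 2.0               # a point inside every leaf rule
    return out

# Tiny demo on synthetic data in [0, 1]
rng = np.random.default_rng(0)
X = rng.random((200, 4))
y = (X[:, 0] > 0.5).astype(int)
forest = RandomForestClassifier(n_estimators=10, max_depth=5, random_state=0).fit(X, y)
codes = encode(forest, X)
X_rec = decode(forest, codes)
```

The actual eForest paper also covers unsupervised (completely random) forests; the supervised random forest above is just the smallest working stand-in for the encode/decode mechanics.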

The code was written by:

  • Egor Sevriugov - Tree ensemble based AE (MNIST, CIFAR-10, Omniglot),
  • Kirill Shcherbakov - CNN based AE (MNIST, CIFAR-10, Omniglot),
  • Maria Begicheva - MLP based AE (MNIST, Omniglot),
  • Olga Novitskaya - MLP based AE (CIFAR-10, Omniglot)

AEbyForest: Project | Paper | Report | Presentation | Video

Train MNIST/Test MNIST

Train CIFAR10/Test CIFAR10

Colab Notebook

Prerequisites

  • Python 3
  • Google Colaboratory service
  • PyTorch 1.4.0, TensorFlow 2.1.0, Keras 2.3.0
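
For a local run outside Colab, one way to install the listed versions (package names assumed to be the standard PyPI ones; these older releases may require a compatible Python 3 version):

```shell
pip install torch==1.4.0 tensorflow==2.1.0 keras==2.3.0
```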

Datasets info

How to launch the code?

To help users understand and use our code, we provide per-model instructions for running the code and reproducing the results:

Related Projects

  • The official implementation of the paper "AutoEncoder by Forest" by Ji Feng and Zhi-Hua Zhou (2017): Paper | Code
  • An unofficial implementation of the same paper by Antoine Passemiers: Paper | Code
