Note: This repository has been moved to https://github.com/KushajveerSingh/deep_learning.
A list of my Jupyter notebooks. I use PyTorch + fastai as my main deep learning libraries, and the notebooks are built using them.
- Multi Sample Dropout implemented and tested on CIFAR-100 with cyclic learning. My losses converged 4x faster when using num_samples=8. notebook, paper
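  The core idea of Multi Sample Dropout can be sketched in a short PyTorch module (a minimal sketch of the technique, not the notebook's exact code; class and parameter names are my own): several independent dropout masks are applied to the same features, the shared classifier head is run once per mask, and the per-mask losses are averaged.

  ```python
  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class MultiSampleDropoutHead(nn.Module):
      """Sketch of a multi-sample dropout classifier head: the same
      linear layer is applied after num_samples independent dropout
      masks, and the cross-entropy losses are averaged."""

      def __init__(self, in_features, n_classes, num_samples=8, p=0.5):
          super().__init__()
          self.dropouts = nn.ModuleList(nn.Dropout(p) for _ in range(num_samples))
          self.fc = nn.Linear(in_features, n_classes)  # weights shared across samples

      def forward(self, feats, target=None):
          logits = [self.fc(drop(feats)) for drop in self.dropouts]
          avg_logits = torch.stack(logits).mean(dim=0)
          if target is None:
              return avg_logits
          # average the per-sample losses (the paper's training objective)
          loss = torch.stack(
              [F.cross_entropy(l, target) for l in logits]
          ).mean()
          return avg_logits, loss
  ```

  At inference time dropout is a no-op, so the head behaves like a single linear layer; the extra dropout samples only affect the training loss.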
- Summarizing Leslie N. Smith's research on cyclic learning and hyper-parameter setting techniques. notebook
  I refer to the following papers by Leslie N. Smith:
- A disciplined approach to neural network hyper-parameters: Part 1 -- learning rate, batch size, momentum, and weight decay
  - Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates
- Exploring loss function topology with cyclical learning rates
- Cyclical Learning Rates for Training Neural Networks
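  The triangular schedule at the heart of the cyclical learning rate papers above fits in a few lines (a minimal sketch; function and parameter names are my own, mirroring the paper's base_lr / max_lr / step_size terminology):

  ```python
  def triangular_lr(step, step_size, base_lr=1e-4, max_lr=1e-2):
      """Triangular cyclical learning rate: ramps linearly from
      base_lr up to max_lr over step_size iterations, then back
      down, repeating every 2 * step_size iterations."""
      cycle = step // (2 * step_size)
      # x goes 1 -> 0 -> 1 within each cycle
      x = abs(step / step_size - 2 * cycle - 1)
      return base_lr + (max_lr - base_lr) * (1 - x)
  ```

  PyTorch ships this as `torch.optim.lr_scheduler.CyclicLR`; the sketch just makes the shape of the schedule explicit.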
- Weight Standardization implemented and tested with cyclic learning. I find that it does not work well with cyclic learning on CIFAR-10. notebook, paper
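  Weight Standardization itself is a small change to a convolution layer (a minimal sketch, not the notebook's exact code; the class name is my own): each output filter's weights are standardized to zero mean and unit variance before the convolution is applied.

  ```python
  import torch
  import torch.nn as nn
  import torch.nn.functional as F

  class WSConv2d(nn.Conv2d):
      """Conv2d with Weight Standardization: the weights of each
      output filter are standardized on the fly; the stored
      parameters are left untouched."""

      def forward(self, x):
          w = self.weight
          # statistics over each filter's (in_channels, kh, kw) slice
          mean = w.mean(dim=(1, 2, 3), keepdim=True)
          std = w.std(dim=(1, 2, 3), keepdim=True) + 1e-5  # eps for stability
          return F.conv2d(x, (w - mean) / std, self.bias, self.stride,
                          self.padding, self.dilation, self.groups)
  ```

  Because only the forward pass changes, the layer is a drop-in replacement for `nn.Conv2d` in an existing model.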
- Library Tutorials