fastaipart2-mooc-notes

This repository contains my notes & code for Part 2 of fastai's course, Deep Learning from the Foundations.

Each file is described below:

  • exp: collects all the relevant code written in the notebooks in one place, so it can be imported and used as a library.
  • 00_exports: converts notebooks to scripts so they can all be integrated easily (a minimal sketch of the idea follows this list).
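
As an illustration of what 00_exports does, here is a minimal sketch of the export idea: read the notebook JSON and write every code cell tagged #export to a module under exp/. The notebook2script name and the nb_XX.py naming follow the course convention, but the code below is my simplified stand-in, not the exact implementation.

    import json, re
    from pathlib import Path

    def notebook2script(fname, out_dir='exp'):
        # keep only the code cells whose first line carries an `#export` marker
        fname = Path(fname)
        nb = json.loads(fname.read_text(encoding='utf-8'))
        cells = [''.join(c['source']) for c in nb['cells']
                 if c['cell_type'] == 'code'
                 and re.match(r'\s*#\s*export', ''.join(c['source']))]
        # e.g. 01_matmul.ipynb -> exp/nb_01.py
        out = Path(out_dir)/f"nb_{fname.stem.split('_')[0]}.py"
        out.parent.mkdir(exist_ok=True)
        out.write_text('\n\n'.join(cells), encoding='utf-8')

    # notebook2script('01_matmul.ipynb')  # writes exp/nb_01.py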

Computer Vision

  • 01_matmul: Matrix multiplication and broadcasting.
  • 02_fully_connected: Coded the forward and backward passes of a neural network. Studied and implemented Xavier and Kaiming initialization (see the sketch after this list). Also recreated the nn.Module and nn.Linear classes from the PyTorch library.
  • 03_minibatch_training: Created optimizer, Dataset and DataLoader classes (as in fastai); built a basic training loop to train the model.
  • 04_callbacks: Added callbacks to the training loop to make it customizable as required.
  • 05_anneal: Implemented a learning-rate annealer (Leslie Smith's one-cycle policy) using the callback system, which achieved better accuracy.
  • 05a_foundations: Ways to use callbacks, special methods in Python (basics), and when not to use softmax.
  • 05b_early_stopping: Implemented early stopping during training using a callback.
  • 06_cuda_cnn_hooks: Training on the GPU; PyTorch hooks and tips on initializing your model's weights.
  • 07_batchnorm: Studied and implemented BatchNorm, InstanceNorm, LayerNorm and GroupNorm.
  • 07a_lsuv: Implemented the algorithm from All You Need Is a Good Init (Layer-Sequential Unit-Variance, LSUV).
  • 08_datablock: Recreated parts of the DataBlock API from the fastai library.
  • 09_optimizers: Implemented optimizers in a flexible way: started with plain SGD, then added momentum and weight decay, implemented the Adam optimizer from scratch, and finished with the LAMB optimizer, which was mentioned here.
  • 09b_learner: Refactored the code and merged the previously created Learner and Runner classes into a single Learner class.
  • 09c_addprogressbar: Added a progress bar (similar to tqdm) that is displayed during training.
  • 10_augmentation: Studied and implemented several image data-augmentation techniques from the fastai library, such as zooming, flipping, random resized cropping and perspective warping. Applied the augmentations to images on the GPU, which sped up execution by orders of magnitude.
  • 10b_mixup_data_augmentation: Studied and implemented MixUp data augmentation and label smoothing, which improved model accuracy.
  • 10c_FP16: Applied mixed-precision training to the model using NVIDIA's Apex library.
  • 11_train_imagenette: Studied and implemented different ResNet variants from the Bag of Tricks for Image Classification paper.
  • 11a_transfer_learning: Implemented transfer learning from scratch: took a model pretrained on the Imagewoof dataset (10 dog breed classes) and fine-tuned it on the Oxford cat and dog breed classification dataset.
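
To make the initialization work in 02_fully_connected concrete, here is a minimal sketch of Kaiming initialization. The helper name kaiming_ and the shapes are illustrative, not taken from the notebook; the point is that scaling weights by sqrt(2/fan_in) keeps ReLU activations at a sensible scale.

    import math
    import torch

    def kaiming_(w):
        # He et al.: std = sqrt(2/fan_in); the factor of 2 compensates
        # for ReLU zeroing out half of the activations
        return w.normal_(0., math.sqrt(2. / w.shape[0]))

    x = torch.randn(512, 784)               # a batch of 512 flattened inputs
    w = kaiming_(torch.zeros(784, 50))
    a = torch.relu(x @ w)
    print(a.mean().item(), a.std().item())  # roughly 0.56 and 0.83: the
                                            # activations neither vanish nor explode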

NLP

  • 12_text: Preprocessed text via tokenization and numericalization from scratch (a toy sketch follows this list). Learnt how to batch NLP data for classification.
  • 12a_awd_lstm: Built the AWD-LSTM model.
  • 12b_lm_pretrain: Pretrained the language model on the WikiText-103 dataset.
  • 12c_ulmfit: Fine-tuned the language model on the IMDb corpus and learnt how to deal with padding; copied the weights of the language model's encoder, attached a new head, then applied transfer learning to train the model to classify IMDb reviews. Achieved close to state-of-the-art accuracy (92.8%).
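
As a toy illustration of the preprocessing in 12_text, here is a sketch of the tokenize-then-numericalize round trip. The real notebooks use a spaCy tokenizer and fastai special tokens (xxbos, xxunk, ...); everything below is a simplified stand-in.

    from collections import Counter

    texts = ["the movie was great", "the plot was thin"]
    tokens = [t.split() for t in texts]        # naive whitespace tokenizer

    # build the vocab: most frequent tokens first, after the special tokens
    counts = Counter(tok for sent in tokens for tok in sent)
    itos = ['xxunk', 'xxpad'] + [t for t, _ in counts.most_common()]
    stoi = {t: i for i, t in enumerate(itos)}  # token -> int lookup

    # numericalize: unknown tokens map to xxunk (index 0)
    ids = [[stoi.get(t, 0) for t in sent] for sent in tokens]
    print(ids)                                 # [[2, 4, 3, 5], [2, 6, 3, 7]]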

Research Papers Implemented:

All credits to Jeremy Howard and the fastai team!
