From 2f4e5bf146dc9e691d5f10dc56b807e64191e3fc Mon Sep 17 00:00:00 2001
From: Harshit Surana
Date: Sun, 6 Sep 2020 04:23:38 +0530
Subject: [PATCH] Updating figures & descriptions for Ch 4 #23

---
 Ch4/README.md | 34 +++++++++++++++++++++++++++++-----
 1 file changed, 29 insertions(+), 5 deletions(-)

diff --git a/Ch4/README.md b/Ch4/README.md
index 9c0cb89..3ada86d 100644
--- a/Ch4/README.md
+++ b/Ch4/README.md
@@ -1,8 +1,15 @@
 # Text Classification
 
-Set of notebooks associated with Chapter 4 of the book.
+## 🔖 Outline
 
-1. **[One Pipeline Many Classifiers](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/01_OnePipeline_ManyClassifiers.ipynb)**: Here we demonstrate text classification using various algorithms such as Naive Bayes, Logistic Regression and Support Vector Machines.
+To be added
+
+
+## 🗒️ Notebooks
+
+Set of notebooks associated with the chapter.
+
+1. **[One Pipeline Many Classifiers](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/01_OnePipeline_ManyClassifiers.ipynb)**: Here we demonstrate text classification using various algorithms such as Naive Bayes, Logistic Regression, and Support Vector Machines.
 
 2. **[Doc2Vec for Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/02_Doc2Vec_Example.ipynb)**: Here we demonstrate how to train your own Doc2Vec embedding and use it for text classification.
 
@@ -12,14 +19,31 @@ Set of notebooks associated with Chapter 4 of the book.
 
 5. **[NNs for Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/05_DeepNN_Example.ipynb)**: Here we demonstrate text classification using pre-trained and custom word embeddings with various Deep Learning Models.
 
-6. **[BERT: Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/06_BERT_IMDB_Sentiment_Classification.ipynb)**: Here we demonstrate how we train and fine tune pytorch pre-trained BERT on IMDB reviews to predict their sentiment using HuggingFace Transformers library.
+6. **[BERT: Text Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/06_BERT_IMDB_Sentiment_Classification.ipynb)**: Here we demonstrate how we train and fine-tune pytorch pre-trained BERT on IMDB reviews to predict their sentiment using HuggingFace Transformers library.
 
 7. **[BERT: Text CLassification using Ktrain](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/07_BERT_Sentiment_Classification_IMDB_ktrain.ipynb)**: Here we demonstrate how we can use BERT to predict the sentiment of movie reviews using the ktrain library.
 
 8. **[LIME-1](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/08_LimeDemo.ipynb)**: Here we demonstrate how to interpret the predictions of a logistic regression model using LIME.
 
-9. **[LIME-2](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/09_Lime_RNN.ipynb)**: Here we demonstrate how to interpret predictions of a RNN model using LIME.
+9. **[LIME-2](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/09_Lime_RNN.ipynb)**: Here we demonstrate how to interpret predictions of an RNN model using LIME.
 
 10. **[SHAP](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/10_ShapDemo.ipynb)**: Here we demonstrate how to interpret ML and DL text classification models using SHAP.
 
-11. **[Spam Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/11_SpamClassification.ipynb)**: Here we demonstrate how to classify a text message as SPAM or HAM using pre-trained models from the fastai library.
+11. **[Spam Classification](https://github.com/practical-nlp/practical-nlp/blob/master/Ch4/11_SpamClassification.ipynb)**: Here we demonstrate how to classify a text message as SPAM or HAM using pre-trained models from the fastai library.
+
+
+## 🖼️ Figures
+
+Color figures as requested by the readers.
+
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-1.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-2.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-3.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-4.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-5.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-6.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-7.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-8.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-9.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-10.png)
+![figure](https://github.com/practical-nlp/practical-nlp-figures/raw/master/figures/4-11.png)
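The "One Pipeline Many Classifiers" notebook referenced in the patch above follows the standard scikit-learn pattern of keeping one feature-extraction step fixed and swapping the final estimator. The sketch below is a minimal illustration of that pattern, not the notebook's actual code; the toy corpus and names here are placeholders.

```python
# Illustrative sketch: one TF-IDF feature extractor, several interchangeable classifiers.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

# Tiny placeholder corpus; the notebook trains on a real labelled dataset.
texts = ["loved the plot", "great acting", "boring and slow", "what a waste",
         "a delightful film", "utterly terrible", "would watch again", "fell asleep"]
labels = [1, 1, 0, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.25, stratify=labels, random_state=42)

classifiers = {
    "Naive Bayes": MultinomialNB(),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Linear SVM": LinearSVC(),
}

# Same pipeline shape each time; only the final estimator changes.
for name, clf in classifiers.items():
    pipe = Pipeline([("tfidf", TfidfVectorizer()), ("clf", clf)])
    pipe.fit(X_train, y_train)
    print(f"{name}: accuracy = {accuracy_score(y_test, pipe.predict(X_test)):.2f}")
```

A pipeline like this also connects naturally to the LIME notebooks listed above: `lime.lime_text.LimeTextExplainer.explain_instance` typically takes a single text and a probability function such as a fitted pipeline's `predict_proba`, returning per-word contributions for that prediction.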