To be added
Set of notebooks associated with the chapter.
- One Pipeline, Many Classifiers: Here we demonstrate text classification using various algorithms such as Naive Bayes, Logistic Regression, and Support Vector Machines.
- Doc2Vec for Text Classification: Here we demonstrate how to train your own Doc2Vec embedding and use it for text classification.
- Word2Vec for Text Classification: Here we demonstrate how to use a pre-trained Word2Vec model for text classification.
- FastText for Text Classification: Here we demonstrate how to use the fasttext library for text classification.
- NNs for Text Classification: Here we demonstrate text classification using pre-trained and custom word embeddings with various deep learning models.
- BERT: Text Classification: Here we demonstrate how to fine-tune a pre-trained PyTorch BERT model on IMDB reviews to predict their sentiment, using the HuggingFace Transformers library.
- BERT: Text Classification using Ktrain: Here we demonstrate how to use BERT to predict the sentiment of movie reviews using the ktrain library.
- LIME-1: Here we demonstrate how to interpret the predictions of a logistic regression model using LIME.
- LIME-2: Here we demonstrate how to interpret the predictions of an RNN model using LIME.
- SHAP: Here we demonstrate how to interpret ML and DL text classification models using SHAP.
- Spam Classification: Here we demonstrate how to classify a text message as SPAM or HAM using pre-trained models from the fastai library.
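As a taste of the first notebook's "one pipeline, many classifiers" idea, here is a minimal sketch using scikit-learn: the same vectorize-then-classify pipeline is reused with each of the three algorithms swapped in. The tiny inline dataset is made up purely for illustration; the actual notebook works with a real corpus.

```python
# Hedged sketch: one sklearn Pipeline, several interchangeable classifiers.
# The texts/labels below are a toy dataset invented for this example.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

texts = ["great movie", "awful film", "loved it",
         "terrible acting", "wonderful plot", "boring and bad"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

classifiers = {
    "Naive Bayes": MultinomialNB(),
    "Logistic Regression": LogisticRegression(),
    "Linear SVM": LinearSVC(),
}

for name, clf in classifiers.items():
    # Identical pipeline shape each time; only the final estimator changes.
    pipe = Pipeline([("vect", CountVectorizer()), ("clf", clf)])
    pipe.fit(texts, labels)
    print(name, "training accuracy:", pipe.score(texts, labels))
```

Because the vectorizer and classifier share one `Pipeline`, swapping algorithms is a one-line change, which is what makes side-by-side comparison convenient.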
Color figures, as requested by readers.