from fastai import * # Quick access to most common functionality
from fastai.text import * # Quick access to NLP functionality
from fastai.docs import * # Access to example data provided with fastai
An example of creating a language model and then transferring it to a classifier.
untar_data(IMDB_PATH)
IMDB_PATH
PosixPath('../data/imdb_sample')
Open and view the independent and dependent variables:
df = pd.read_csv(IMDB_PATH/'train.csv', header=None)
df.head()
|   | 0 | 1 |
|---|---|---|
| 0 | 0 | Un-bleeping-believable! Meg Ryan doesn't even ... |
| 1 | 1 | This is a extremely well-made film. The acting... |
| 2 | 0 | Every once in a long while a movie will come a... |
| 3 | 1 | Name just says it all. I watched this movie wi... |
| 4 | 0 | This movie succeeds at being one of the most u... |
classes = read_classes(IMDB_PATH/'classes.txt')
classes[0], classes[1]
('negative', 'positive')
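The labels in column 0 of the CSV are numeric indices into this class list. As a plain-pandas illustration (not fastai's internals), mapping the label column to class names looks like this, using a tiny stand-in for the sample data:

```python
import pandas as pd

# A tiny stand-in for the IMDB sample: column 0 is the numeric label,
# column 1 is the review text (the real file has no header row).
df = pd.DataFrame({0: [0, 1, 0],
                   1: ["Un-bleeping-believable!",
                       "This is a extremely well-made film.",
                       "Every once in a long while a movie will come a..."]})
classes = ['negative', 'positive']

# Map each numeric label to its class name.
labels = df[0].map(lambda i: classes[i])
print(labels.tolist())  # ['negative', 'positive', 'negative']
```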
Create a DataBunch for the language model and another for the classifier (the classifier reuses the language model's vocabulary):
data_lm = text_data_from_csv(Path(IMDB_PATH), data_func=lm_data)
data_clas = text_data_from_csv(Path(IMDB_PATH), data_func=classifier_data, vocab=data_lm.train_ds.vocab)
Tokenizing train.
Numericalizing train.
Tokenizing valid.
Numericalizing valid.
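Under the hood, "tokenizing" splits each text into tokens and "numericalizing" replaces each token with its index in a vocabulary. Here's a bare-bones sketch of the idea in plain Python (fastai's real tokenizer also handles punctuation, casing, and special tokens); note that the classifier is passed the language model's vocab so the two models agree on these indices:

```python
from collections import Counter

texts = ["the movie was great", "the movie was awful"]

# Tokenize: split each text into tokens.
tokenized = [t.split() for t in texts]

# Build a vocab ordered by frequency, then numericalize:
# replace each token with its vocabulary index.
counts = Counter(tok for toks in tokenized for tok in toks)
itos = [w for w, _ in counts.most_common()]   # index -> string
stoi = {w: i for i, w in enumerate(itos)}     # string -> index
ids = [[stoi[tok] for tok in toks] for toks in tokenized]
print(ids)  # [[0, 1, 2, 3], [0, 1, 2, 4]]
```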
fast.ai has a pre-trained English model available that we can download.
download_wt103_model()
We'll fine-tune the language model:
learn = RNNLearner.language_model(data_lm, pretrained_fnames=['lstm_wt103', 'itos_wt103'])
learn.unfreeze()
learn.fit(2, slice(1e-4,1e-2))
Total time: 23:49

| epoch | train loss | valid loss | accuracy | time |
|---|---|---|---|---|
| 0 | 4.871409 | 4.130483 | 0.251005 | 12:35 |
| 1 | 4.607700 | 4.064432 | 0.257633 | 11:13 |
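The `slice(1e-4, 1e-2)` argument to `fit` requests discriminative learning rates: the earliest layer group trains with the smallest rate and the last group with the largest, so the pretrained lower layers change gently while the new head moves fast. A minimal sketch of that spacing, assuming the rates are spread geometrically across three layer groups (the exact scheme inside fastai may differ):

```python
import numpy as np

# Spread learning rates geometrically from the lowest layer group
# (small rate, preserve pretrained weights) to the highest (large rate).
lo, hi, n_groups = 1e-4, 1e-2, 3
lrs = np.geomspace(lo, hi, n_groups)
print(lrs)  # [1.e-04 1.e-03 1.e-02]
```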
Save our language model's encoder:
learn.save_encoder('enc')
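Only the encoder is saved because that's the part the classifier can reuse: the language model's decoder head is discarded and replaced with a classification head. A framework-agnostic sketch of the idea with plain dicts (all key names here are hypothetical, not fastai's):

```python
# Weights of the fine-tuned language model: an encoder plus an LM head.
lm_weights = {'encoder.embed': [0.1, 0.2], 'decoder.out': [0.9]}

# save_encoder: keep only the encoder's parameters.
enc = {k: v for k, v in lm_weights.items() if k.startswith('encoder.')}

# load_encoder: the classifier keeps its own head but starts
# from the language model's encoder weights.
clf_weights = {'encoder.embed': [0.0, 0.0], 'head.linear': [0.5]}
clf_weights.update(enc)
print(clf_weights['encoder.embed'])  # [0.1, 0.2]
```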
Fine-tune it to create a classifier:
learn = RNNLearner.classifier(data_clas)
learn.load_encoder('enc')
learn.fit(3, 1e-3)
Total time: 14:47

| epoch | train loss | valid loss | accuracy | time |
|---|---|---|---|---|
| 0 | 0.686279 | 0.670379 | 0.690000 | 04:45 |
| 1 | 0.657580 | 0.624417 | 0.710000 | 05:01 |
| 2 | 0.648344 | 0.578787 | 0.705000 | 05:00 |