from fastai import * # Quick access to most common functionality
from fastai.text import * # Quick access to NLP functionality
An example of creating a language model and then transferring it to a classifier.
path = untar_data(URLs.IMDB_SAMPLE)
path
PosixPath('/home/ubuntu/fastai/fastai/../data/imdb_sample')
Open and view the independent and dependent variables:
df = pd.read_csv(path/'train.csv', header=None)
df.head()
|   | 0 | 1 |
|---|---|---|
| 0 | 0 | Un-bleeping-believable! Meg Ryan doesn't even ... |
| 1 | 1 | This is a extremely well-made film. The acting... |
| 2 | 0 | Every once in a long while a movie will come a... |
| 3 | 1 | Name just says it all. I watched this movie wi... |
| 4 | 0 | This movie succeeds at being one of the most u... |
classes = read_classes(path/'classes.txt')
classes[0], classes[1]
('negative', 'positive')
Create a DataBunch for each of the language model and the classifier:
data_lm = TextLMDataBunch.from_csv(path)
data_clas = TextClasDataBunch.from_csv(path, vocab=data_lm.train_ds.vocab)
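Passing the language model's vocab to the classifier's DataBunch keeps the token ids consistent between the two stages, so the encoder we fine-tune below can be reused directly. As a quick, optional sanity check (a sketch assuming this version's Vocab exposes an itos list mapping ids back to tokens):
vocab = data_lm.train_ds.vocab  # vocabulary shared with the classifier
len(vocab.itos), vocab.itos[:10]  # vocab size and the first few tokens (special tokens come first)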
fast.ai has an English language model pre-trained on WikiText-103 that we can download:
datasets.download_wt103_model()
We'll fine-tune the language model:
learn = RNNLearner.language_model(data_lm, pretrained_fnames=['lstm_wt103', 'itos_wt103'])
learn.unfreeze()
learn.fit(2, slice(1e-4,1e-2)) # discriminative learning rates: 1e-4 for the earliest layer groups up to 1e-2 for the last
Total time: 00:46

| epoch | train loss | valid loss | accuracy | time  |
|-------|------------|------------|----------|-------|
| 0     | 4.905827   | 4.156222   | 0.246251 | 00:23 |
| 1     | 4.632848   | 4.087290   | 0.254222 | 00:23 |
Save our language model's encoder so the classifier can reuse it:
learn.save_encoder('enc')
Now build a classifier on top of that encoder and fine-tune it:
learn = RNNLearner.classifier(data_clas)
learn.load_encoder('enc')
learn.fit(3, 1e-3)
Total time: 00:46

| epoch | train loss | valid loss | accuracy | time  |
|-------|------------|------------|----------|-------|
| 0     | 0.669598   | 0.671728   | 0.560000 | 00:15 |
| 1     | 0.664123   | 0.618949   | 0.720000 | 00:15 |
| 2     | 0.653664   | 0.589520   | 0.710000 | 00:15 |
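With the classifier trained, we can try it on a new review. A minimal sketch, assuming the Learner.predict inference call from later fastai v1 releases (the exact API may differ in this early version); the review text is just an illustrative example:
# Returns the predicted class ('negative'/'positive') along with the class probabilities
learn.predict("This film was an absolute delight from start to finish.")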