# Continuing_Pretrain_BERT

This is an example of using your own data to continue pre-training an already pre-trained BERT model, then fine-tuning the result on your own labeled dataset, and finally using the fine-tuned model on its own to make predictions on data from your domain.
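
Below is a minimal sketch of that workflow using the Hugging Face `transformers` and `datasets` libraries. It is an illustration under assumed names (e.g. the corpus file `domain_corpus.txt`, output directory `bert-domain-pretrained`, and a two-label classification task), not the repository's exact code.

```python
# Sketch: continue pre-training BERT on your own text (masked language modeling),
# then reuse the checkpoint for fine-tuning and prediction.
# File paths, label count, and hyperparameters below are hypothetical placeholders.

from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
    pipeline,
)

# 1) Continue pre-training on your own raw domain text.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

raw = load_dataset("text", data_files={"train": "domain_corpus.txt"})
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
trainer = Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="bert-domain-pretrained", num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
trainer.save_model("bert-domain-pretrained")
tokenizer.save_pretrained("bert-domain-pretrained")

# 2) Fine-tune the continued-pretrained checkpoint on your labeled dataset
#    (fine-tuning loop omitted here; it uses Trainer in the same way with
#    your labeled examples instead of raw text).
clf_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-domain-pretrained", num_labels=2
)

# 3) Use the fine-tuned model alone to predict on new domain data.
classifier = pipeline("text-classification", model=clf_model, tokenizer=tokenizer)
print(classifier("An example sentence from your own domain."))
```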