
Releases: JasonZhangzy1757/the-effect-of-domain-corpus-size-for-pretraining

BERT model pre-trained on 8 GB of biomedical text

08 Apr 23:14

Model pre-trained on 8 GB of biomedical text for 22 epochs (130,000 steps at a batch size of 112).
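
For reference, here is a minimal sketch of how a released checkpoint might be loaded with Hugging Face Transformers, assuming the release asset unpacks into a standard checkpoint directory (config.json, vocab.txt, weights); the local path `./bert-biomedical-8gb` is hypothetical.

```python
# Minimal loading sketch. The checkpoint directory path is hypothetical;
# point it at wherever the release asset was extracted.
from transformers import BertForMaskedLM, BertTokenizer

checkpoint_dir = "./bert-biomedical-8gb"  # hypothetical extraction path

tokenizer = BertTokenizer.from_pretrained(checkpoint_dir)
model = BertForMaskedLM.from_pretrained(checkpoint_dir)

# Smoke test: fill a masked token in a biomedical-style sentence.
inputs = tokenizer("The patient was treated with [MASK].", return_tensors="pt")
logits = model(**inputs).logits
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = int(logits[0, mask_pos].argmax(-1))
print(tokenizer.decode([predicted_id]))
```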

BERT model pre-trained on 4 GB of biomedical text

08 Apr 21:55

Model pre-trained on 4 GB of biomedical text for 1 epoch (3,500 steps at a batch size of 112).

BERT model pre-trained on 12 GB of biomedical text

08 Apr 23:16

Model pre-trained on 12 GB of biomedical text for 5 epochs (63,000 steps at a batch size of 112).

model 2

29 Mar 21:14
095220c
Pre-release

Test


Full Changelog: model...model2