robinq committed on
Commit
fd7a2f3
1 Parent(s): f518dbe

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -10,7 +10,7 @@ This BERT model was trained using the Megatron-LM library.
  The size of the model is a regular BERT-large with 340M parameters.
  The model was trained on about 70GB of data, consisting mostly of OSCAR and Swedish newspaper text curated by the National Library of Sweden.
 
- Training was done for 110k training steps using a batch size of 8k; the number of training steps is set to 500k, meaning that this version is a checkpoint.
+ Training was done for 165k training steps using a batch size of 8k; the number of training steps is set to 500k, meaning that this version is a checkpoint.
  The hyperparameters for training followed the setting for RoBERTa.
 
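The "340M parameters" figure in the diff can be sanity-checked with a quick back-of-the-envelope count for the standard BERT-large configuration (24 layers, hidden size 1024, FFN size 4096). This is a sketch, not the authors' accounting: the 30,522-token vocabulary below is the standard English BERT vocab and is an assumption — the Swedish model's vocabulary size may differ, which shifts the embedding term.

```python
def bert_param_count(vocab=30522, hidden=1024, layers=24, max_pos=512, ffn=4096):
    # Embeddings: token + position + segment tables, plus one LayerNorm (weight + bias).
    emb = vocab * hidden + max_pos * hidden + 2 * hidden + 2 * hidden
    # Self-attention: Q, K, V and output projections, each hidden x hidden plus bias.
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward block: up-projection and down-projection, each with a bias.
    ff = hidden * ffn + ffn + ffn * hidden + hidden
    # Two LayerNorms per transformer layer (weight + bias each).
    ln = 2 * 2 * hidden
    # Pooler head applied to the final [CLS] representation.
    pooler = hidden * hidden + hidden
    return emb + layers * (attn + ff + ln) + pooler

print(bert_param_count())  # about 335M, commonly rounded to "340M"
```

The count lands near 335M with the English vocab; a larger Swedish vocabulary would push it closer to the quoted 340M, since each extra token adds `hidden` (1024) embedding parameters.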