---
language: "no"
license: cc-by-4.0
pipeline_tag: fill-mask
tags:
- norwegian
- bert
thumbnail: nblogo_2.png
---
**Release 1.0** (January 13, 2021)

# NB-BERT
## Description

NB-BERT is a general BERT-base model built on the digital collection at the National Library of Norway.
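Since the model card declares the `fill-mask` pipeline tag, a minimal usage sketch may help; the model ID `NBAiLab/nb-bert-base` is an assumption here — check the repositories linked below for the published name.

```python
from transformers import pipeline

# Model ID is an assumption; see the NBAiLab repositories for the published checkpoint name.
fill_mask = pipeline("fill-mask", model="NBAiLab/nb-bert-base")

# [MASK] is the standard BERT mask token.
predictions = fill_mask("På biblioteket kan du låne en [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing, among other fields, the proposed token (`token_str`) and its probability (`score`).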
## Intended use & limitations

The 1.0 version of the model is general, and should be fine-tuned for any particular use. Some fine-tuning sets can be found on GitHub; see

* https://github.com/NBAiLab/notram
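A minimal sketch of how fine-tuning would start, assuming the model ID `NBAiLab/nb-bert-base` (not confirmed by this card): a fresh classification head is placed on top of the pretrained encoder, to be trained on your labeled data.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed model ID; see the GitHub repository above for available checkpoints.
model_id = "NBAiLab/nb-bert-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# num_labels adds a randomly initialized classification head on the encoder.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Sanity check: one sentence in, one row of per-label logits out.
inputs = tokenizer("Dette er en test.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)
```

From here, the model would typically be trained with `transformers.Trainer` or a plain PyTorch loop on a task-specific dataset.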
## Training data

The model was trained on a wide variety of text. The training set is described at

* https://github.com/NBAiLab/notram
## More information

For more information on the model, see

* https://github.com/NBAiLab/notram