profetize/my-wiki-tokenizer
Branch: main
1 contributor, 2 commits.
Latest commit: a43d021 "Upload tokenizer" by profetize, about 1 year ago.
File                       Size       Last commit       Date
.gitattributes             1.52 kB    initial commit    about 1 year ago
merges.txt                 236 kB     Upload tokenizer  about 1 year ago
special_tokens_map.json    99 Bytes   Upload tokenizer  about 1 year ago
tokenizer.json             1.06 MB    Upload tokenizer  about 1 year ago
tokenizer_config.json      234 Bytes  Upload tokenizer  about 1 year ago
vocab.json                 401 kB     Upload tokenizer  about 1 year ago
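The vocab.json / merges.txt pair in this repository is the standard file layout for a byte-pair-encoding (BPE) tokenizer, and tokenizer.json is the consolidated fast-tokenizer format; such a repo is typically loaded with `AutoTokenizer.from_pretrained("profetize/my-wiki-tokenizer")` from the transformers library. As a minimal sketch of what those two files encode, the example below applies BPE merges to a word. The merge list and vocabulary here are tiny hypothetical stand-ins, not data from the actual my-wiki-tokenizer files.

```python
# Hypothetical merges, highest priority first (the role of merges.txt).
merges = [("l", "o"), ("lo", "w"), ("e", "r")]

# Hypothetical vocabulary: token string -> integer id (the role of vocab.json).
vocab = {"l": 0, "o": 1, "w": 2, "e": 3, "r": 4, "lo": 5, "low": 6, "er": 7}

def bpe(word):
    """Apply BPE merges to a word, starting from single characters."""
    symbols = list(word)
    ranks = {pair: i for i, pair in enumerate(merges)}
    while len(symbols) > 1:
        # Rank every adjacent pair; unknown pairs get infinite rank.
        pairs = [(ranks.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        best_rank, i = min(pairs)
        if best_rank == float("inf"):
            break  # no applicable merge remains
        # Merge the best-ranked adjacent pair into one symbol.
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

tokens = bpe("lower")          # ["low", "er"]
ids = [vocab[t] for t in tokens]  # [6, 7]
```

A real tokenizer also handles byte-level pre-tokenization and the entries in special_tokens_map.json, but the merge loop above is the core of how vocab.json and merges.txt turn text into ids.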