# Tokenizer used for all BLOOM models
Tokenizer information is provided at [https://huggingface.co/bigscience/bloom#preprocessing](https://huggingface.co/bigscience/bloom#preprocessing)
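
As a quick sanity check, the tokenizer can be loaded through the `transformers` `AutoTokenizer` API. This is a minimal sketch; the example sentence is purely illustrative:

```python
from transformers import AutoTokenizer

# Load the tokenizer shared by the BLOOM model family (downloads from the Hub).
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom")

# Encode a sample sentence into token ids, then decode it back.
ids = tokenizer("BLOOM is a multilingual language model.")["input_ids"]
print(ids)
print(tokenizer.decode(ids))
```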
TODO: point to paper once it comes out with extra details on the tokenizer