jinaai/jina-bert-flash-implementation
Tags: Transformers · bert · custom_code · Region: EU
130 kB · 6 contributors · 80 commits
Latest commit 0ff7c3d by Markus28: "feat: use property in LoRA parametrization" (over 1 year ago)
File                   Size     Last commit message                                                Updated
bert_padding.py        9.78 kB  reference the flash attention GitHub                               over 1 year ago
block.py               17.4 kB  reference the flash attention GitHub                               over 1 year ago
configuration_bert.py  5.76 kB  added classifier dropout                                           over 1 year ago
embedding.py           2.26 kB  clean up embeddings.py (#7)                                        over 1 year ago
mha.py                 35.3 kB  reference the flash attention GitHub                               over 1 year ago
mlp.py                 6.17 kB  reference the flash attention GitHub                               over 1 year ago
modeling_bert.py       28.7 kB  feat: choose flash attention heuristically if not set explicitly  over 1 year ago
modeling_for_glue.py   10.7 kB  feat: assert return_dict                                           over 1 year ago
modeling_lora.py       9.79 kB  feat: use property in LoRA parametrization                         over 1 year ago
tokenizer.py           3.95 kB  support-fast-tokenizer (#6)                                        over 1 year ago
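
Because the repository is tagged custom_code, these files are not part of the transformers library itself; they are downloaded and executed when a checkpoint whose auto_map points at this repository is loaded with trust_remote_code=True. Below is a minimal usage sketch; the checkpoint name is an assumption for illustration and is not taken from this listing.

```python
# Minimal sketch, assuming a checkpoint whose auto_map references
# jinaai/jina-bert-flash-implementation (the model_id below is a hypothetical example).
from transformers import AutoModel, AutoTokenizer

model_id = "jinaai/jina-embeddings-v2-base-en"  # assumed example checkpoint

# trust_remote_code=True tells transformers to fetch and run the files listed above
# (configuration_bert.py, modeling_bert.py, ...) instead of its built-in BERT classes.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Flash-attention BERT example", return_tensors="pt")
outputs = model(**inputs)

# BERT-style output; assumes the custom model returns last_hidden_state.
print(outputs.last_hidden_state.shape)
```

Per the commit message on modeling_bert.py, the flash attention path is chosen heuristically unless it is set explicitly in the configuration.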