pytorch / Phi-4-mini-instruct-FP8
Tags: Text Generation, Transformers, PyTorch, multilingual, phi3, torchao, phi, phi4, nlp, code, math, chat, conversational, custom_code, text-generation-inference
Paper: arXiv:2507.16099
License: mit
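
The tags above (Transformers, torchao, custom_code, conversational) indicate a torchao FP8-quantized checkpoint served through the standard transformers API. Below is a minimal loading sketch, assuming the usual AutoModelForCausalLM workflow; the dtype, device_map, and prompt are illustrative, and README.md in this repository is the authoritative usage guide.

# Minimal sketch: load the FP8 (torchao) checkpoint with transformers.
# Assumes torch, torchao, and transformers are installed; parameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pytorch/Phi-4-mini-instruct-FP8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # activations in bf16; quantized weights are torchao tensor subclasses
    device_map="auto",
    trust_remote_code=True,      # the repo ships configuration_phi3.py (custom_code tag)
)

messages = [{"role": "user", "content": "Explain FP8 quantization in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))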
Phi-4-mini-instruct-FP8: 4.48 GB, 4 contributors, 84 commits
Latest commit e1ae0e3 (verified, 5 days ago): jerryzh168, "Upload configuration_phi3.py"
Files:

File                      Size       Scan    Last commit                   Updated
.gitattributes            1.57 kB    Safe    Upload tokenizer              6 months ago
LICENSE                   1.04 kB    Safe    Update LICENSE                5 months ago
README.md                 13.7 kB    Safe    Update README.md              13 days ago
added_tokens.json         249 Bytes  Safe    Upload tokenizer              6 months ago
chat_template.jinja       423 Bytes  Safe    Upload tokenizer              3 months ago
config.json               3.83 kB    Safe    Upload Phi3ForCausalLM        14 days ago
configuration_phi3.py     10.9 kB    Safe    Upload configuration_phi3.py  5 days ago
generation_config.json    169 Bytes  Safe    Upload Phi3ForCausalLM        14 days ago
merges.txt                2.42 MB    Safe    Upload tokenizer              6 months ago
pytorch_model.bin         4.45 GB    pickle  Upload Phi3ForCausalLM        14 days ago
special_tokens_map.json   587 Bytes  Safe    Upload tokenizer              6 months ago
tokenizer.json            15.5 MB    Safe    Upload tokenizer              6 months ago
tokenizer_config.json     2.52 kB    Safe    Upload tokenizer              3 months ago
vocab.json                3.91 MB    Safe    Upload tokenizer              6 months ago

pytorch_model.bin is a pickled checkpoint; the Hub's pickle scan detected 17 imports (see the loading sketch after this listing):
torch.device, torch._utils._rebuild_tensor_v2, torch._tensor._rebuild_from_type_v2,
torch._utils._rebuild_tensor_v3, torch._utils._rebuild_wrapper_subclass,
torchao.quantization.quantize_.workflows.float8.float8_tensor.QuantizeTensorToFloat8Kwargs,
torch.serialization._get_layout, torchao.float8.inference.Float8MMConfig,
torch.storage.UntypedStorage,
torchao.quantization.quantize_.common.kernel_preference.KernelPreference,
collections.OrderedDict, torch.BFloat16Storage, torchao.quantization.Float8Tensor,
torch.FloatStorage, torch.float8_e4m3fn, torch.bfloat16,
torchao.quantization.granularity.PerRow
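
The pickle scan above shows that pytorch_model.bin serializes torchao tensor subclasses (Float8Tensor, PerRow granularity, Float8MMConfig) rather than plain tensors, so torchao must be importable when the checkpoint is unpickled. Here is a minimal inspection sketch, assuming the file has been downloaded locally; weights_only=False executes the pickle, so only do this for a checkpoint you trust.

# Minimal sketch: inspect the pickled checkpoint directly.
# torchao must be installed so the pickled torchao classes can be resolved;
# weights_only=True may reject them unless they are allowlisted
# (e.g. via torch.serialization.add_safe_globals).
import torch
import torchao  # noqa: F401  (needed to resolve torchao.quantization.Float8Tensor, etc.)

state_dict = torch.load(
    "pytorch_model.bin",
    map_location="cpu",
    weights_only=False,  # executes the pickle; use only with a trusted checkpoint
)

# Print a few entries to see which weights are FP8 tensor subclasses.
for name, value in list(state_dict.items())[:8]:
    print(name, type(value).__name__, getattr(value, "dtype", None))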