DavidAU/How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation, English, MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge
License: apache-2.0
Files and versions (main branch, 7.67 kB, 1 contributor, 7 commits)
Latest commit: Update README.md (f76331b, verified) by DavidAU, about 2 months ago

.gitattributes (1.52 kB): initial commit, 7 months ago
README.md (6.16 kB): Update README.md, about 2 months ago