DavidAU / How-To-Set-and-Manage-MOE-Mix-of-Experts-Model-Activation-of-Experts
Tags: Text Generation, English, MOE, Mixture of Experts, Mixtral, 4X8, 2X8, deepseek, reasoning, reason, thinking, all use cases, bfloat16, float32, float16, role play, sillytavern, backyard, lmstudio, Text Generation WebUI, llama 3, mistral, llama 3.1, qwen 2.5, context 128k, mergekit, Merge

License: apache-2.0