---
license: other
tags:
- comfyui
- flux
- sdxl
- gguf
- stable diffusion
- t5xxl
- controlnet
- unet
- vae
- model hub
- one click
- upscaler
---
> ⚠️ **Work in Progress**
> This repo is actively being developed, especially the model card. Use it as you see fit, but know that some things may not be accurate or up to date yet.
# ComfyUI-Starter-Packs
> A curated vault of the most essential models for ComfyUI users. Flux1, SDXL, ControlNets, Clips, GGUFs all in one place. Carefully organized.
> Hit the heart at the top next to this repo's name if you find it useful. Small actions like that, done by many, let me know that working on this repo is worthwhile.
---
## 🪜 What's Inside
This repo is a **purposeful collection** of the most important models, organized into folders so that everything you need for a given setup lives in one place:
### Flux1
- **Unet Models**: Dev, Schnell, Depth, Canny, Fill
- **GGUF Versions**: Q3, Q5, Q6 for each major branch
- **Clip + T5XXL** encoders (standard + GGUF versions)
- **Loras**: Only included if they are especially useful or meaningfully improve the model.
### SDXL
- **Recommended checkpoints to get you started**: Pony Realism and Juggernaut
- **Base + Refiner** official models
- **ControlNets**: Depth, Canny, OpenPose, Normal, etc.
### Extra
- VAE, upscalers, and anything required to support workflows
---
## 🏋️ Unet Recommendations (Based on VRAM)
| VRAM | Use Case | Recommended File |
|------|----------|-------------------|
| 16GB+ | Full FP8 | flux1-dev-fp8.safetensors |
| 12GB | Balanced Q5_K_S | GGUF flux1-dev-Q5_K_S.gguf |
| 8GB | Light Q3_K_S | GGUF flux1-dev-Q3_K_S.gguf |
GGUF models are significantly lighter and designed for **low-VRAM** systems.
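If you'd rather script the download than click through the file browser, here is a minimal Python sketch using `huggingface_hub` that mirrors the table above. The `repo_id` is a placeholder for this repo's actual id, and the VRAM thresholds are just the recommended tiers, not hard limits.
```python
# Minimal sketch: pick a Flux Unet by VRAM tier and download it.
# The repo_id is a placeholder -- substitute this repo's real id.
from huggingface_hub import hf_hub_download

# Repo-relative paths follow the folder structure shown later in this card.
VRAM_TO_UNET = {
    16: "Flux1/unet/Dev/flux1-dev-fp8.safetensors",
    12: "Flux1/unet/Dev/GGUF/flux1-dev-Q5_K_S.gguf",
    8:  "Flux1/unet/Dev/GGUF/flux1-dev-Q3_K_S.gguf",
}

def pick_unet(vram_gb: int) -> str:
    """Return the repo-relative path of the Unet that fits your VRAM tier."""
    for tier in sorted(VRAM_TO_UNET, reverse=True):
        if vram_gb >= tier:
            return VRAM_TO_UNET[tier]
    return VRAM_TO_UNET[8]  # fall back to the lightest build

local_path = hf_hub_download(
    repo_id="MaxedOut/ComfyUI-Starter-Packs",  # assumption: replace with the real repo id
    filename=pick_unet(12),
    local_dir="ComfyUI/models/unet",           # repo subfolders are preserved under local_dir
)
print(local_path)
```
Note that `hf_hub_download` keeps the repo's subfolders under `local_dir`, so you may still need to move or symlink the file to wherever your ComfyUI install expects Unets.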
## 🧠 T5XXL Recommendations (Based on RAM)
| System RAM | Use Case | Recommended File |
|------------|----------|-------------------|
| 64GB | Max quality | t5xxl_fp16.safetensors |
| 32GB | High quality (can crash if multitasking) | t5xxl_fp16.safetensors or t5xxl_fp8_e4m3fn_scaled.safetensors |
| 16GB | Balanced | t5xxl_fp8_e4m3fn_scaled.safetensors |
| <16GB | Low-memory / Safe mode | GGUF t5-v1_1-xxl-encoder-Q5_K_M.gguf or Q3_K_L.gguf |
Quantizing t5xxl only directly affects prompt adherence, since it is the text encoder; image detail still comes from the Unet.
> ⚠️ These are **recommended tiers**, not hard rules. RAM usage depends on your active processes, ComfyUI extensions, batch sizes, and other factors.
> If you're getting random crashes, try scaling down one tier.
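If you want the same logic in a script, here is a rough sketch. It assumes the `psutil` package (`pip install psutil`) for reading total RAM; the thresholds mirror the table and the filenames mirror the folder structure further down this card.
```python
# Rough sketch: pick a t5xxl encoder from the RAM tiers above.
# Assumption: psutil is installed; thresholds are recommendations, not hard rules.
import psutil

def pick_t5xxl() -> str:
    """Return the repo-relative path of a t5xxl encoder for this machine's RAM."""
    ram_gb = psutil.virtual_memory().total / 1024**3
    if ram_gb >= 64:
        return "Flux1/clip/t5xxl_fp16.safetensors"                # max quality
    if ram_gb >= 16:
        # 32GB can also try fp16 per the table, but fp8 is the safer default
        return "Flux1/clip/t5xxl_fp8_e4m3fn_scaled.safetensors"
    # low-memory / safe mode: quantized GGUF encoder
    return "Flux1/clip/GGUF/t5-v1_1-xxl-encoder-Q5_K_M.gguf"

print(pick_t5xxl())
```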
---
## 🏛 Folder Structure
```
Adetailer/
├─ Ultralytics/bbox/
│ ├─ face_yolov8m.pt
│ └─ hand_yolov8s.pt
└─ sams/
└─ sam_vit_b_01ec64.pth
Flux1/
├─ PuLID/
│ └─ pulid_flux_v0.9.1.safetensors
├─ Style_Models/
│ └─ flux1-redux-dev.safetensors
├─ clip/
│ ├─ ViT-L-14-TEXT-detail-improved-hiT-GmP-HF.safetensors
│ ├─ clip_l.safetensors
│ ├─ t5xxl_fp16.safetensors
│ ├─ t5xxl_fp8_e4m3fn_scaled.safetensors
│ └─ GGUF/
│ ├─ t5-v1_1-xxl-encoder-Q3_K_L.gguf
│ └─ t5-v1_1-xxl-encoder-Q5_K_M.gguf
├─ clip_vision/
│ └─ sigclip_vision_patch14_384.safetensors
├─ vae/
│ └─ ae.safetensors
└─ unet/
├─ Dev/
│ ├─ flux1-dev-fp8.safetensors
│ └─ GGUF/
│ ├─ flux1-dev-Q3_K_S.gguf
│ └─ flux1-dev-Q5_K_S.gguf
├─ Fill/
│ ├─ flux1-fill-dev-fp8.safetensors
│ └─ GGUF/
│ ├─ flux1-fill-dev-Q3_K_S.gguf
│ └─ flux1-fill-dev-Q5_K_S.gguf
├─ Canny/
│ ├─ flux1-canny-dev-fp8.safetensors
│ └─ GGUF/
│ ├─ flux1-canny-dev-Q4_0-GGUF.gguf
│ └─ flux1-canny-dev-Q5_0-GGUF.gguf
├─ Depth/
│ ├─ flux1-depth-dev-fp8.safetensors
│ └─ GGUF/
│ ├─ flux1-depth-dev-Q4_0-GGUF.gguf
│ └─ flux1-depth-dev-Q5_0-GGUF.gguf
└─ Schnell/
├─ flux1-schnell-fp8-e4m3fn.safetensors
└─ GGUF/
├─ flux1-schnell-Q3_K_S.gguf
└─ flux1-schnell-Q5_K_S.gguf
```
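To grab just a slice of the vault while keeping this layout, `snapshot_download` with `allow_patterns` works well. A sketch, assuming a placeholder `repo_id` and an example pattern set; adjust the patterns to whatever subset you actually want:
```python
# Sketch: download only selected folders, preserving the layout shown above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="MaxedOut/ComfyUI-Starter-Packs",  # assumption: replace with the real repo id
    allow_patterns=[
        "Flux1/unet/Dev/GGUF/*",  # low-VRAM Dev builds only
        "Flux1/clip/*",           # clip_l + t5xxl encoders
        "Flux1/vae/*",            # ae.safetensors
    ],
    local_dir="ComfyUI-Starter-Packs",
)
```
From there, copy or symlink the files into your ComfyUI `models/unet`, `models/clip`, and `models/vae` folders, or point ComfyUI at the download location with its `extra_model_paths.yaml` config.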
---
## 📈 Model Previews (Coming Soon)
I might add a single grid-style graphic showing example outputs:
- **Dev vs Schnell**: Quality vs Speed
- **Depth / Canny / Fill**: Source image → processed map → output
- **SDXL examples**: Realism, Stylized, etc.
Previews for each group will be combined into one compact image block to keep the page light.
---
## 📢 Want It Even Easier?
Skip the manual downloads.
🎁 **[Patreon.com/MaxedOut](https://patreon.com)** — (Coming Soon) Get:
- One-click installers for all major Flux & SDXL workflows
- ComfyUI workflows built for beginners and pros
- Behind-the-scenes model picks and tips
---
## ❓ FAQ
**Q: Why not every GGUF?**
A: Because Q3, Q5, and Q6 cover the most meaningful range. No bloat.
**Q: Are these the official models?**
A: Yes. Most are sourced directly from the original creators or from validated mirrors.
**Q: Will this grow?**
A: Yes. But only with purpose. Not a collection of every model off the face of the earth.
**Q: Why aren’t there more Loras here?**
A: Stylized or niche Loras are showcased on Patreon, where we do deeper dives and examples. Some may get added here later if they become foundational.
---
## ✨ Final Thoughts
You shouldn’t need to hunt through 12 Civitai pages and 6 Hugging Face repos just to build your ComfyUI models folder.
This repo fixes that. |