Aiming to create the best-quality quants for a Mac Studio 512GB
Bib Projects
bibproj
Recent Activity
New activity about 19 hours ago in mlx-community/GLM-4.7-8bit-gs32: "Can the M2 Ultra Mac Pro with 192GB memory run this model?"
New activity about 23 hours ago in mlx-community/DeepSeek-V3.2-4bit: "[SOLVED] No instruction following, model just outputs vaguely relevant text, or goes into loops"
New activity about 24 hours ago in mlx-community/MiniMax-M2.1-3bit: "Anyone running this with M4 Max 128GB? How does it compare to 4-bit quantization?"