Two Faced Reasoning
Training a model to think negatively.
- patrickleenyc/Qwen3-4B-TwoFace — Text Generation • 4B • Updated Jul 28 • 9 • 2
- patrickleenyc/Qwen3-4B-TwoFace-Q8_0-GGUF — 4B • Updated Jul 28 • 5
- patrickleenyc/hermes_reasoning_tool_use_with_cursing — Viewer • Updated Jul 28 • 45.6k • 9 • 1
llms
- The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits — Paper • 2402.17764 • Published Feb 27, 2024 • 627