parachas committed (verified)
Commit 9f3c8fb · 1 Parent(s): eda1b7d

Update README.md

Files changed (1): README.md +8 -8
README.md CHANGED
@@ -14,13 +14,15 @@ library_name: transformers
 # katanemo/Arch-Router-1.5B
 
 ## Overview
-With the rapid proliferation of large language models (LLM)each optimized for different strengths, style, or latency/cost profilerouting has become an essential technique to operationalize the use of different models.
+With the rapid proliferation of large language models (LLMs) -- each optimized for different strengths, styles, or latency/cost profiles -- routing has become an essential technique for operationalizing the use of different models.
 
-Existing work on LLM routing typically focuses on learning an optimal policy to route between a limited pool of models, where optimal is measured via well-defined performance benchmarks. This framework, however, is misaligned with real-world scenarios.
-Benchmark performance does not capture subjective evaluation and testing criteria in the real world.
+However, existing LLM routing approaches are limited in two key ways: they evaluate performance using benchmarks that often fail to capture human preferences driven by subjective evaluation criteria, and they typically select from a limited pool of models.
+We introduce a preference-aligned routing framework that guides model selection by matching queries to user-defined domains (e.g., travel) or action types (e.g., image editing) -- offering a practical mechanism to encode preferences in routing decisions.
+Specifically, we introduce Arch-Router, a compact 1.5B model that learns to map queries to domain-action preferences for model routing decisions.
 
-Arch-Router is a **preference-aligned routing model** designed to intelligently guide model selection by matching queries to user-defined domains (e.g., finance and healthcare) and action types (e.g., code generation, image editing, etc.).
-Experiments on conversational datasets demonstrate that our approach achieves state-of-the-art (SOTA) results in matching queries with human preferences, outperforming top proprietary routing systems. Our preference-aligned approach matches practical definitions of performance in the real world and makes routing decisions more transparent and adaptable.
+Experiments on conversational datasets demonstrate that our approach achieves state-of-the-art (SOTA) results in matching queries with human preferences, outperforming top proprietary models.
+
+Arch-Router powers [Arch](https://github.com/katanemo/arch), the open-source AI-native proxy for agents, and enables seamless, preference-based routing in multi-LLM systems.
 
 ### How It Works
 
@@ -37,9 +39,7 @@ Both domain and action configs are associated with preferred models or model variants
 - **Flexible and Adaptive**: Supports evolving user needs, model updates, and new domains/actions without retraining the router.
 - **Production-Ready Performance**: Optimized for low-latency, high-throughput applications in multi-model environments.
 
-
-Arch-Router powers the open-source [Arch Gateway](https://github.com/katanemo/arch), enabling seamless, preference-based prompt routing in multi-LLM systems.
-
 
 # Requirements
 The code of Arch-Router-1.5B is available in the Hugging Face `transformers` library, and we advise you to install the latest version:
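The domain-action routing described in the README can be sketched as a simple preference lookup: the router model predicts a (domain, action) pair for a query, and a user-defined table maps that pair to a preferred model. This is a minimal illustrative sketch, assuming a hypothetical config shape and placeholder model names, not the official Arch Gateway schema:

```python
# Sketch of preference-aligned routing: a router (e.g. Arch-Router-1.5B)
# predicts a (domain, action) pair for a query, and a user-defined
# preference table maps that pair to a preferred model.
# The table shape and model names below are illustrative assumptions.

ROUTE_PREFERENCES = {
    ("finance", "analysis"): "model-a",
    ("travel", "booking"): "model-b",
    ("coding", "code_generation"): "model-c",
}
DEFAULT_MODEL = "model-default"  # fallback when no preference matches


def route(domain: str, action: str) -> str:
    """Return the preferred model for a predicted (domain, action) pair."""
    return ROUTE_PREFERENCES.get((domain, action), DEFAULT_MODEL)


print(route("travel", "booking"))       # -> model-b
print(route("legal", "summarization"))  # -> model-default
```

Because the preferences live in configuration rather than in the router's weights, new domains/actions can be added or preferred models swapped without retraining the router, as the "Flexible and Adaptive" point above notes.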