Add Text Embeddings Inference in README.md (#13)
- Add Text Embeddings Inference in README.md (9dd3b64e518714786448be8712676452ef76cdfe)
Co-authored-by: Alvaro Bartolome <[email protected]>
README.md CHANGED

@@ -4,6 +4,8 @@ base_model:
 - google/embeddinggemma-300m
 pipeline_tag: sentence-similarity
 library_name: transformers.js
+tags:
+- text-embeddings-inference
 ---
 
 # EmbeddingGemma model card
@@ -133,6 +135,16 @@ ranking = similarities.argsort()[::-1]
 print(ranking) # [1 2 3 0]
 ```
 
+#### Using the ONNX Runtime in Text Embeddings Inference (TEI)
+
+```bash
+docker run -p 8080:80 \
+    ghcr.io/huggingface/text-embeddings-inference:cpu-1.8.1 \
+    --model-id onnx-community/embeddinggemma-300M-ONNX \
+    --dtype float32 \
+    --pooling mean
+```
+
 ## Model Data
 
 ### Training Dataset
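Once the container from the added snippet is running, it can be smoke-tested over TEI's `/embed` route. A minimal sketch, assuming the server is listening on localhost:8080 as in the `docker run` command above; the input sentence is purely illustrative:

```bash
# Request an embedding for a single input sentence from the local TEI server.
# The response is a JSON array with one embedding vector per input.
curl 127.0.0.1:8080/embed \
    -X POST \
    -H 'Content-Type: application/json' \
    -d '{"inputs": "What is Deep Learning?"}'
```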