Update README.md
README.md CHANGED
@@ -26,17 +26,25 @@ English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian,
**Intended Use:**
Prominent use cases of LLMs in text-to-text generation include summarization, text classification, extraction, question-answering, and other long-context tasks. All Granite Base models can handle these tasks, as they were trained on a large amount of data from various domains. Moreover, they can serve as a baseline for creating specialized models for specific application scenarios.

**Usage:**
You need to install transformers from source to use this checkpoint.
<!-- This is a simple example of how to use the Granite-4.0-Tiny-Base-Preview model. -->

<!-- Usage: Install transformers from source, or use transformers version v4.45 to use this checkpoint. -->

Hugging Face Transformers PR: https://github.com/huggingface/transformers/pull/37658

Install transformers from source: https://huggingface.co/docs/transformers/en/installation#install-from-source
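
A source install typically amounts to the following (a minimal sketch; the installation guide linked above is the authoritative reference):

```shell
# install the latest transformers directly from the GitHub main branch
pip install git+https://github.com/huggingface/transformers.git
```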

<!-- Install the following libraries:

```shell
pip install torch torchvision torchaudio
pip install accelerate
pip install transformers
``` -->

**Generation:**
After installation, copy the code snippet below to run the example.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
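# ---------------------------------------------------------------
# Illustrative continuation (a minimal sketch): the model id, the
# prompt, and the generation settings below are assumptions, not
# prescribed by this README.
# ---------------------------------------------------------------
model_path = "ibm-granite/granite-4.0-tiny-base-preview"  # assumed Hugging Face repo id

# load the tokenizer and the model; device_map="auto" places the weights
# on whatever accelerator (or CPU) is available
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, device_map="auto")
model.eval()

# tokenize a prompt and move the input ids to the model's device
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# generate a short completion and decode it back into text
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))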