---
license: apache-2.0
language:
- en
base_model:
- meta-llama/Llama-3.2-3B-Instruct
pipeline_tag: text-generation
---
## Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "ziyingchen1106/Llama-3.2-3B-Instruct-fp16-lora-gptqmodel-4bit"

# Load the 4-bit GPTQ-quantized model onto the first CUDA device.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="cuda:0",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
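Once the model and tokenizer are loaded, generation follows the usual `transformers` pattern. A minimal sketch (the prompt and `max_new_tokens` value are illustrative, not part of this repository): since this is an Instruct model, inputs should go through the tokenizer's chat template rather than being passed as raw text.

```python
def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Illustrative helper: load the model and answer a single prompt.

    Note: downloads the checkpoint on first call and requires a CUDA GPU,
    matching the device_map used above.
    """
    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_name = "ziyingchen1106/Llama-3.2-3B-Instruct-fp16-lora-gptqmodel-4bit"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.float16,
        device_map="cuda:0",
    )

    # Llama 3.2 Instruct expects chat-formatted input.
    messages = [{"role": "user", "content": prompt}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(input_ids, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

# Example call (requires GPU and model download):
# print(generate("Summarize GPTQ quantization in one sentence."))
```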
## Attribution
- Built with Llama
- Llama 3.2 Community License © Meta Platforms, Inc.