---
license: apache-2.0
datasets:
  - sabma-labs/endless-move-code
base_model:
  - Qwen/Qwen2.5-Coder-32B-Instruct
tags:
  - move
  - blockchain
  - endless
  - smart-contracts
  - code-generation
  - programming
---

# Endless-Coder-32B-Instruct

Endless-Coder-32B-Instruct is a large language model curated by sabma-labs for Endless Protocol Move-language workflows: answering Move questions, generating Move modules, and following Endless-specific conventions. It is based on Qwen/Qwen2.5-Coder-32B-Instruct.

## Usage Example

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "sabma-labs/Endless-Coder-32B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Load in bfloat16 and shard across available GPUs.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

prompt = (
    "<|system|>You are a Move language programming assistant developed by Endless Labs.<|end|>"
    "<|user|>Explain how to create a counter module in Move.<|end|>"
    "<|assistant|>"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,  # enable sampling so temperature/top_p take effect
    temperature=0.7,
    top_p=0.9,
    pad_token_id=tokenizer.pad_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
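
If the tokenizer ships a chat template (for example, one inherited from the Qwen/Qwen2.5-Coder-32B-Instruct base model), prompts can also be built with `tokenizer.apply_chat_template` instead of hand-assembled special tokens. The sketch below assumes such a template is present; check `tokenizer_config.json` to confirm before relying on it.

```python
# Sketch: prompt construction via the tokenizer's chat template.
# Assumes a chat template is bundled with the tokenizer (unverified here).
messages = [
    {"role": "system", "content": "You are a Move language programming assistant developed by Endless Labs."},
    {"role": "user", "content": "Explain how to create a counter module in Move."},
]
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,  # append the assistant turn marker
    return_tensors="pt",
).to(model.device)
outputs = model.generate(
    input_ids,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

If your hardware cannot hold the full bfloat16 weights (a 32B model needs roughly 64 GB of GPU memory before overhead), 4-bit quantization via bitsandbytes is one option. This is a minimal sketch, not a configuration verified for this checkpoint; it assumes the `bitsandbytes` package is installed.

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
import torch

# Sketch: 4-bit NF4 quantization to reduce the memory footprint.
# Quality/latency trade-offs for this checkpoint have not been measured here.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "sabma-labs/Endless-Coder-32B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)
```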

## Contact

