Mistral-7B fine-tuned on a dataset of BTS fanfic, with a 32k context length.

This model uses the Alpaca instruction format:
{"instruction": "An interaction between a user providing instructions, and an imaginative assistant providing responses.", "input": "...", "output": "..."}
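A minimal sketch of turning one record in that format into a prompt string. The `### Instruction:` / `### Input:` / `### Response:` section headers follow the common Alpaca template; the card does not confirm the exact layout, so treat it as an assumption:

```python
def build_alpaca_prompt(instruction: str, input_text: str = "", output: str = "") -> str:
    """Assemble an Alpaca-style prompt from one dataset record.

    The section headers below are the standard Alpaca template, assumed
    here; verify against the model's actual training template before use.
    """
    parts = ["### Instruction:\n" + instruction]
    if input_text:
        parts.append("### Input:\n" + input_text)
    parts.append("### Response:\n" + output)
    return "\n\n".join(parts)


record = {
    "instruction": ("An interaction between a user providing instructions, "
                    "and an imaginative assistant providing responses."),
    "input": "Write a short scene backstage before a concert.",
    "output": "",
}
prompt = build_alpaca_prompt(record["instruction"], record["input"])
print(prompt)
```

Leaving `output` empty, as above, produces a generation prompt; filling it in produces a complete training example.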
Note: this model uses a RoPE scaling factor of 8.0 with linear RoPE scaling.
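With linear RoPE scaling, position indices are divided by the scaling factor before computing rotary embeddings, which is how the extended 32k context maps back into the base model's trained position range. A small sketch of that relationship (the base-context figure is an assumption about Mistral-7B, not stated on this card):

```python
def scaled_position(pos: int, factor: float = 8.0) -> float:
    """Linear RoPE scaling: divide the raw position index by the factor.

    factor=8.0 matches the setting stated on the card.
    """
    return pos / factor


# The last position of a 32k context collapses to under 4096 after scaling,
# i.e. well inside the positions the base model saw during pretraining.
print(scaled_position(32767))
```

When loading with 🤗 Transformers, the equivalent config would be a `rope_scaling` dict such as `{"type": "linear", "factor": 8.0}`; check the repo's `config.json` for the authoritative values.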