---
base_model:
- hfl/llama-3-chinese-8b-instruct
- meta-llama/Meta-Llama-3-8B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
# zh-llama-3-chinese-8b-instruct-x-meta-llama-3-8b-instruct-otter_merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the OTTER-Merge (OT + Top-eigenspace Elastic Regularization) method, with [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) as the base.
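OTTER-Merge is not a stock mergekit method, so the configuration below is the authoritative reference. What follows is only a minimal per-matrix sketch of what the name suggests, under two assumptions: that `eps` and `iters` drive an entropic-OT (Sinkhorn) alignment of output neurons, and that `k_preserve` and `beta` control how strongly the merged task vector is damped inside the base's top singular subspace. All function names are hypothetical, and `rank_r`, `ot_cap`, and `group_size` are omitted for brevity.

```python
import torch

def sinkhorn_plan(a, b, eps=0.05, iters=50):
    """Hypothetical OT alignment step: a soft permutation matching the
    output neurons (rows) of weight matrix b to those of a."""
    cost = torch.cdist(a, b)                    # pairwise row distances
    K = torch.exp(-cost / (eps * cost.mean()))  # Gibbs kernel, scaled for stability
    r = torch.ones(a.shape[0])
    c = torch.ones(b.shape[0])
    for _ in range(iters):                      # Sinkhorn-Knopp scaling
        r = 1.0 / (K @ c)
        c = 1.0 / (K.T @ r)
    P = r[:, None] * K * c[None, :]             # approximately doubly stochastic plan
    return P / P.sum(dim=1, keepdim=True)       # row-normalize -> soft permutation

def otter_merge_matrix(base, models, weights, k_preserve=32, beta=0.5,
                       align=True, eps=0.05, iters=50):
    """Hypothetical OTTER-Merge for one weight matrix: OT-align each model
    to the base, average the task vectors, then elastically shrink the part
    of the merged delta that lies in the base's top-k singular subspace."""
    deltas = []
    for m in models:
        if align:
            m = sinkhorn_plan(base, m, eps, iters) @ m   # align rows to base
        deltas.append(m - base)
    delta = sum(w * d for w, d in zip(weights, deltas))  # weighted task vector

    U, _, _ = torch.linalg.svd(base, full_matrices=False)
    U_k = U[:, :k_preserve]                              # base top-k eigenspace
    in_span = U_k @ (U_k.T @ delta)                      # delta component inside it
    # "Elastic" regularization: damp the in-span component by beta.
    return base + (delta - beta * in_span)
```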
### Models Merged

The following models were included in the merge:

* [hfl/llama-3-chinese-8b-instruct](https://huggingface.co/hfl/llama-3-chinese-8b-instruct)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
tokenizer:
  source: union
merge_method: otter_merge
models:
  - model: hfl/llama-3-chinese-8b-instruct
    parameters:
      weight: 0.5
  - model: meta-llama/Meta-Llama-3-8B-Instruct
    parameters:
      weight: 0.5
base_model: meta-llama/Meta-Llama-3-8B-Instruct
parameters:
  weights:
    - 0.5
    - 0.5
  align: true
  eps: 0.05
  iters: 50
  rank_r: 64
  k_preserve: 32
  beta: 0.5
  ot_cap: 1024
  group_size: 0
write_readme: README.md
```
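Assuming the merge output was saved locally (the directory name below is hypothetical), the result loads like any other Llama-3-family `transformers` checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical local output directory of the merge.
model_dir = "./zh-llama-3-chinese-8b-instruct-x-meta-llama-3-8b-instruct-otter_merge"

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(
    model_dir, torch_dtype=torch.bfloat16, device_map="auto"
)

# The merge targets Chinese capability, so a quick Chinese smoke test:
messages = [{"role": "user", "content": "用中文介绍一下你自己。"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```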