---
base_model:
- darkc0de/XortronCriminalComputingConfig
- TheDrummer/Cydonia-24B-v3
- Sorawiz/MistralCreative-24B-Chat
library_name: transformers
tags:
- mergekit
- merge
---
# Untitled Model (1)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged with the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, using [darkc0de/XortronCriminalComputingConfig](https://huggingface.co/darkc0de/XortronCriminalComputingConfig) as the base.
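For intuition, here is a minimal sketch of a DARE TIES update on a single weight tensor, assuming the formulation in the paper: each fine-tuned model's delta from the base is randomly dropped and rescaled (DARE), a per-parameter sign is elected from the weighted deltas, and only deltas agreeing with that sign are summed back into the base (TIES). This is an illustrative reconstruction, not mergekit's actual code; `dare_ties_merge` and its arguments are hypothetical names.

```python
import torch

def dare_ties_merge(base, finetuned, weights, density=1.0):
    """Illustrative DARE TIES merge of one tensor (not mergekit's implementation).

    base:      base-model tensor
    finetuned: list of fine-tuned tensors with the same shape
    weights:   per-model merge weights (e.g. [0.4, 0.3, 0.3])
    density:   fraction of delta entries DARE keeps (1.0 keeps everything)
    """
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base                                 # task vector
        keep = torch.bernoulli(torch.full_like(delta, density))
        deltas.append(w * delta * keep / density)         # DARE: drop, rescale survivors

    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))              # TIES: per-parameter sign election
    agree = torch.sign(stacked) == elected
    # Keep only sign-consistent deltas; mergekit additionally normalizes by the
    # weights of the agreeing models, omitted here for brevity.
    merged = (stacked * agree).sum(dim=0)
    return base + merged
```

With `density: 1.0`, as in the configuration below, the Bernoulli mask keeps every entry, so the DARE dropout step is a no-op and the merge reduces to sign-elected weighted task arithmetic over the three models.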
### Models Merged

The following models were included in the merge:

* [TheDrummer/Cydonia-24B-v3](https://huggingface.co/TheDrummer/Cydonia-24B-v3)
* [Sorawiz/MistralCreative-24B-Chat](https://huggingface.co/Sorawiz/MistralCreative-24B-Chat)

### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model: darkc0de/XortronCriminalComputingConfig
chat_template: auto
merge_method: dare_ties
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 40]
        model: darkc0de/XortronCriminalComputingConfig
        parameters:
          weight: 0.4
      - layer_range: [0, 40]
        model: Sorawiz/MistralCreative-24B-Chat
        parameters:
          weight: 0.3
      - layer_range: [0, 40]
        model: TheDrummer/Cydonia-24B-v3
        parameters:
          weight: 0.3
out_dtype: bfloat16
parameters:
  density: 1.0
tokenizer: {}
```
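To reproduce the merge, save the configuration above as `config.yaml` and run `mergekit-yaml config.yaml ./merged-model`. The merged weights can then be loaded with transformers as a sketch like the one below; the local path is a placeholder for wherever the mergekit output was written, and the prompt is only an example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "./merged-model"  # placeholder: the mergekit output directory

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short story about a lighthouse keeper."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```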