pipeline_tag: text-generation
library_name: transformers
---

This is a MetaMath-Mistral-2x7B Mixture of Experts (MoE) model, created with [mergekit](https://github.com/cg123/mergekit) for the purpose of experimenting with and learning about MoE merging.
## Merge Details
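
The exact configuration used for this model is not shown here, but as a rough sketch, a mergekit MoE merge is typically driven by a YAML file naming a base model and the expert models, with prompts that steer the router. All model ids and prompts below are illustrative assumptions, not the actual recipe:

```yaml
# Hypothetical mergekit-moe config (illustrative only).
# base_model and the expert repos are assumptions; substitute the real ones.
base_model: mistralai/Mistral-7B-v0.1
gate_mode: hidden        # route using hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: meta-math/MetaMath-Mistral-7B
    positive_prompts:
      - "solve this math problem"
      - "reason step by step"
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts:
      - "answer the question"
```

A config like this would then be passed to mergekit's MoE entry point to produce the merged 2x7B checkpoint.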