kasinadhsarma committed on
Commit 4f598b1 · verified · 1 Parent(s): 00a8cd9

Update README.md

Files changed (1): README.md (+10 −7)
README.md CHANGED
@@ -14,6 +14,10 @@ datasets:
 - TIGER-Lab/MMLU-Pro
 - openai/MMMLU
 license: mit
+metrics:
+- code_eval
+- accuracy
+pipeline_tag: text2text-generation
 ---
 
 # VishwamAI
@@ -162,11 +166,11 @@ Working to minimize environmental impact through:
 
 ```bibtex
 @software{vishwamai2024,
-  author = {Your Team},
+  author = {Kasinadhsarma},
   title = {VishwamAI: Enhanced Transformer with Advanced Reasoning Capabilities},
   year = {2024},
   publisher = {GitHub},
-  url = {https://github.com/yourusername/VishwamAI}
+  url = {https://github.com/VishwamAI/VishwamAI}
 }
 ```
@@ -185,14 +189,13 @@ output = model(input_ids)
 
 ## Additional Information
 
-- **Repository**: [GitHub Repository](https://github.com/yourusername/VishwamAI)
-- **Issues**: [GitHub Issues](https://github.com/yourusername/VishwamAI/issues)
-- **Documentation**: [Project Documentation](https://github.com/yourusername/VishwamAI/docs)
-
+- **Repository**: [GitHub Repository](https://github.com/VishwamAI/VishwamAI)
+- **Issues**: [GitHub Issues](https://github.com/VishwamAI/VishwamAI/issues)
+- **Documentation**: under construction; we are developing it
 ## Acknowledgments
 
 This project builds upon several research papers and open-source projects. We thank the authors and contributors of:
 - Transformer architectures
 - Mixture of Experts implementations
 - Tree of Thoughts reasoning
-- Neural memory architectures
+- Neural memory architectures