Update README.md
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)

## Hardware

Note that by default, the Phi-3.5-MoE-instruct model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:

* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100
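One practical consequence of the flash-attention requirement is that the listed GPUs (A100, A6000, H100) are all Ampere-class or newer, i.e. CUDA compute capability 8.0+. A minimal sketch of a capability check with a portable fallback is shown below; the full repo id `microsoft/Phi-3.5-MoE-instruct` and the `attn_implementation` loading pattern are assumptions based on the usual Hugging Face `transformers` API, not something this README specifies.

```python
def pick_attn_implementation(major: int, minor: int) -> str:
    """Choose an attention backend from the CUDA compute capability.

    Flash attention needs compute capability >= 8.0, which covers
    Ampere (A100, A6000) and Hopper (H100) GPUs.
    """
    if (major, minor) >= (8, 0):
        return "flash_attention_2"
    return "eager"  # portable fallback for older GPUs


# Hypothetical usage (requires torch, transformers, and a CUDA GPU;
# the repo id below is an assumption):
#
# import torch
# from transformers import AutoModelForCausalLM
#
# major, minor = torch.cuda.get_device_capability()
# model = AutoModelForCausalLM.from_pretrained(
#     "microsoft/Phi-3.5-MoE-instruct",
#     attn_implementation=pick_attn_implementation(major, minor),
#     torch_dtype="auto",
# )
```

On unsupported hardware, falling back to the `"eager"` attention path trades speed and memory efficiency for portability.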