Update README.md

README.md CHANGED

@@ -92,7 +92,8 @@ The current value of Key1 is Value_N.
LLMs cannot reliably retrieve Value_N. The answer distribution spans value_1 to value_N, and as N increases, answers skew increasingly toward value_1.
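To make the failure mode concrete, here is a hypothetical miniature instance; the released data uses longer contexts and its own phrasing, so treat the template below as illustrative only:

```
Update 1: Key1 is now value_1.
Update 2: Key1 is now value_2.
...
Update N: Key1 is now value_N.

Q: What is the current value of Key1?
Correct answer: value_N      Typical failure: value_1
```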
## Note:
We **RANDOMIZE** the update order after generation to mimic unpredictable changes. Counterintuitively, this often helps LLMs, since the final update usually lands near the end of the context. In the sequential setting, most smaller **(less than around 600B)** models lose track after only a few updates, even with 5–8k-token inputs. (The sequential-mode dataset is provided separately.)
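For concreteness, a minimal Python sketch of the two presentation modes described above. Everything in it is an assumption for illustration, not the dataset's actual generation code: the function name `build_sample`, the `Update i:` template, and the explicit update index (which is what keeps "the current value" well-defined once the order is shuffled).

```python
import random


def build_sample(key: str, n_updates: int, mode: str = "randomized", seed: int = 0) -> dict:
    """Build one key-update retrieval sample.

    mode="sequential": updates appear in chronological order (value_1 ... value_N).
    mode="randomized": the same updates are shuffled after generation, so the
    chronologically last update can land anywhere in the context.
    """
    rng = random.Random(seed)

    # Hypothetical update template; the released dataset's wording may differ.
    updates = [f"Update {i}: {key} is now value_{i}." for i in range(1, n_updates + 1)]

    if mode == "randomized":
        rng.shuffle(updates)  # mimic unpredictable, out-of-order changes

    context = "\n".join(updates)
    question = f"What is the current value of {key}?"
    answer = f"value_{n_updates}"  # the chronologically last update always wins
    return {"context": context, "question": question, "answer": answer}


if __name__ == "__main__":
    sample = build_sample("Key1", n_updates=8, mode="randomized")
    print(sample["context"])
    print(sample["question"], "->", sample["answer"])
```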
## Why this is challenging for LLMs:
- Multiple co-references to the same key cause strong interference.