Datasets:

Sample records from the dataset follow. Each record has the fields below; the three `dict` fields map tokenizer names to per-tokenizer values. (`second_lang` and `notes` are empty in every sample shown.)

| field | type |
|---|---|
| question | string |
| choices | list |
| answer | int64 |
| answer_label | string |
| split | string |
| subcategories | string |
| category | string |
| lang | string |
| second_lang | string |
| notes | string |
| id | string |
| set_id | float64 |
| variation_id | float64 |
| vanilla_cos_sim_to_canonical | dict |
| trimmed_cos_sim_to_canonical | dict |
| token_counts | dict |
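Assuming each record is available as a plain Python dict with the field names above (the record literal here is abridged from the first sample row; only two tokenizer entries are kept for brevity), a minimal sketch of reading one record:

```python
# Abridged record from the first sample row of this dataset.
record = {
    "question": "Dr Smith is a doctor. Occupation of Dr Smith is",
    "choices": ["teach", "judge", "law", "doc"],
    "answer": 3,
    "answer_label": "D",
    "split": "test",
    "subcategories": "Abbreviations",
    "category": "Social Media & Informal Text",
    "lang": "eng_Latn",
    "id": "300-0.2",
    "set_id": 300.0,
    "variation_id": 0.2,
    "token_counts": {"gpt2": 12, "google/byt5-small": 47},
}

def correct_choice(rec):
    """Return the text of the correct choice; `answer` indexes into `choices`."""
    return rec["choices"][rec["answer"]]

def label_matches(rec):
    """`answer_label` should be the letter A-D corresponding to `answer`."""
    return rec["answer_label"] == "ABCD"[rec["answer"]]
```

The `id` string appears to concatenate `set_id` and `variation_id` (e.g. `300-0.2`), so records sharing a `set_id` are variations of one canonical question.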
---
question: Dr Smith is a doctor. Occupation of Dr Smith is
choices: ["teach", "judge", "law", "doc"]
answer: 3 (answer_label: D)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 300-0.2 (set_id: 300, variation_id: 0.2)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 0.9999998211860657,
"facebook/xglm-564M": 1.0000001192092896,
"google-bert/bert-base-multilingual-cased": 0.9999998211860657,
"google/byt5-small": 1.0000001192092896,
"google/gemma-2-2b": 0.9999998807907104,
"gpt2": 0.9999998211860657,
"meta-llama/Llama-3.2-1B": 1.0000001192092896,
"microsoft/Phi-3-mini-4k-instruct": 0.9999998807907104,
"mistralai/tekken": 0.9999998807907104,
"tiktoken/gpt-4o": 0.9999999403953552,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000001192092896
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 11,
"Qwen/Qwen3-8B": 11,
"bigscience/bloom": 12,
"common-pile/comma-v0.1-1t": 13,
"facebook/xglm-564M": 13,
"google-bert/bert-base-multilingual-cased": 14,
"google/byt5-small": 47,
"google/gemma-2-2b": 11,
"gpt2": 12,
"meta-llama/Llama-3.2-1B": 11,
"microsoft/Phi-3-mini-4k-instruct": 13,
"mistralai/tekken": 11,
"tiktoken/gpt-4o": 12,
"tokenmonster/englishcode-32000-consistent-v1": 11
}
---
question: Dr Smith is a doctor. Occupation of Dr Smith is
choices: ["Prof", "Hon", "MD", "Esq"]
answer: 2 (answer_label: C)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 300-0.16 (set_id: 300, variation_id: 0.16)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 0.9999998211860657,
"facebook/xglm-564M": 1.0000001192092896,
"google-bert/bert-base-multilingual-cased": 0.9999998211860657,
"google/byt5-small": 1.0000001192092896,
"google/gemma-2-2b": 0.9999998807907104,
"gpt2": 0.9999998211860657,
"meta-llama/Llama-3.2-1B": 1.0000001192092896,
"microsoft/Phi-3-mini-4k-instruct": 0.9999998807907104,
"mistralai/tekken": 0.9999998807907104,
"tiktoken/gpt-4o": 0.9999999403953552,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000001192092896
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 11,
"Qwen/Qwen3-8B": 11,
"bigscience/bloom": 12,
"common-pile/comma-v0.1-1t": 13,
"facebook/xglm-564M": 13,
"google-bert/bert-base-multilingual-cased": 14,
"google/byt5-small": 47,
"google/gemma-2-2b": 11,
"gpt2": 12,
"meta-llama/Llama-3.2-1B": 11,
"microsoft/Phi-3-mini-4k-instruct": 13,
"mistralai/tekken": 11,
"tiktoken/gpt-4o": 12,
"tokenmonster/englishcode-32000-consistent-v1": 11
}
---
question: Dr Smith is an MD. Occipation of Dr Smith is
choices: ["teacher", "doctor", "judge", "lawyer"]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 300-0.37 (set_id: 300, variation_id: 0.37)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8486781716346741,
"Qwen/Qwen3-8B": 0.8233134150505066,
"bigscience/bloom": 0.873887836933136,
"common-pile/comma-v0.1-1t": 0.7490841150283813,
"facebook/xglm-564M": 0.8408849239349365,
"google-bert/bert-base-multilingual-cased": 0.7985623478889465,
"google/byt5-small": 0.9447391033172607,
"google/gemma-2-2b": 0.835648775100708,
"gpt2": 0.8251903057098389,
"meta-llama/Llama-3.2-1B": 0.8439704775810242,
"microsoft/Phi-3-mini-4k-instruct": 0.8045406937599182,
"mistralai/tekken": 0.8300552368164062,
"tiktoken/gpt-4o": 0.8318107724189758,
"tokenmonster/englishcode-32000-consistent-v1": 0.7099841237068176
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.3852706253528595,
"Qwen/Qwen3-8B": 0.23539593815803528,
"bigscience/bloom": 0.4065379500389099,
"common-pile/comma-v0.1-1t": 0.13768307864665985,
"facebook/xglm-564M": 0.3379814028739929,
"google-bert/bert-base-multilingual-cased": 0.4463389217853546,
"google/byt5-small": 0.6483171582221985,
"google/gemma-2-2b": 0.35768020153045654,
"gpt2": 0.3287959694862366,
"meta-llama/Llama-3.2-1B": 0.3190908432006836,
"microsoft/Phi-3-mini-4k-instruct": 0.2861732244491577,
"mistralai/tekken": 0.27699482440948486,
"tiktoken/gpt-4o": 0.3433341383934021,
"tokenmonster/englishcode-32000-consistent-v1": 0.19402597844600677
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 12,
"Qwen/Qwen3-8B": 12,
"bigscience/bloom": 13,
"common-pile/comma-v0.1-1t": 15,
"facebook/xglm-564M": 13,
"google-bert/bert-base-multilingual-cased": 14,
"google/byt5-small": 44,
"google/gemma-2-2b": 12,
"gpt2": 12,
"meta-llama/Llama-3.2-1B": 12,
"microsoft/Phi-3-mini-4k-instruct": 14,
"mistralai/tekken": 12,
"tiktoken/gpt-4o": 12,
"tokenmonster/englishcode-32000-consistent-v1": 16
}
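For perturbed variants like the record above (the misspelled "Occipation" form, id 300-0.37), the trimmed similarity falls well below the vanilla one. A minimal sketch of computing the per-tokenizer drop, using two value pairs copied from that record:

```python
# Vanilla vs. trimmed cosine similarity for two tokenizers,
# values copied from record 300-0.37 above (abridged).
vanilla = {"gpt2": 0.8251903057098389, "google/byt5-small": 0.9447391033172607}
trimmed = {"gpt2": 0.3287959694862366, "google/byt5-small": 0.6483171582221985}

# How much similarity each tokenizer loses on this variant.
drop = {name: vanilla[name] - trimmed[name] for name in vanilla}

# Tokenizers ranked from smallest to largest drop; the byte-level
# tokenizer is the most stable of the two here.
ranked = sorted(drop, key=drop.get)
```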
---
question: The # of continents on Earth is
choices: ["5", "6", "8", "7"]
answer: 3 (answer_label: D)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 304-0.2 (set_id: 304, variation_id: 0.2)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8553209900856018,
"Qwen/Qwen3-8B": 0.8774767518043518,
"bigscience/bloom": 0.8670704960823059,
"common-pile/comma-v0.1-1t": 0.8822222352027893,
"facebook/xglm-564M": 0.8440374135971069,
"google-bert/bert-base-multilingual-cased": 0.8720088005065918,
"google/byt5-small": 0.9478123188018799,
"google/gemma-2-2b": 0.8665192127227783,
"gpt2": 0.8531822562217712,
"meta-llama/Llama-3.2-1B": 0.8680265545845032,
"microsoft/Phi-3-mini-4k-instruct": 0.8711134791374207,
"mistralai/tekken": 0.8590569496154785,
"tiktoken/gpt-4o": 0.8604445457458496,
"tokenmonster/englishcode-32000-consistent-v1": 0.7292405962944031
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 7,
"Qwen/Qwen3-8B": 7,
"bigscience/bloom": 7,
"common-pile/comma-v0.1-1t": 8,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 31,
"google/gemma-2-2b": 7,
"gpt2": 7,
"meta-llama/Llama-3.2-1B": 7,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 7,
"tiktoken/gpt-4o": 7,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The no. of continents on Earth is
choices: ["5", "7", "6", "8"]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 304-0.15 (set_id: 304, variation_id: 0.15)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.7763581871986389,
"Qwen/Qwen3-8B": 0.824273943901062,
"bigscience/bloom": 0.811360776424408,
"common-pile/comma-v0.1-1t": 0.8064047694206238,
"facebook/xglm-564M": 0.8018571138381958,
"google-bert/bert-base-multilingual-cased": 0.8182386159896851,
"google/byt5-small": 0.9467005133628845,
"google/gemma-2-2b": 0.804913341999054,
"gpt2": 0.7843954563140869,
"meta-llama/Llama-3.2-1B": 0.7866487503051758,
"microsoft/Phi-3-mini-4k-instruct": 0.8095418810844421,
"mistralai/tekken": 0.7949608564376831,
"tiktoken/gpt-4o": 0.7873102426528931,
"tokenmonster/englishcode-32000-consistent-v1": 0.7079726457595825
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.05974763259291649,
"Qwen/Qwen3-8B": 0.03043564409017563,
"bigscience/bloom": 0.07808922976255417,
"common-pile/comma-v0.1-1t": -0.02923409827053547,
"facebook/xglm-564M": 0.06573980301618576,
"google-bert/bert-base-multilingual-cased": -0.015136616304516792,
"google/byt5-small": -0.020509906113147736,
"google/gemma-2-2b": 0.0652402713894844,
"gpt2": 0.008664418943226337,
"meta-llama/Llama-3.2-1B": -0.008986718952655792,
"microsoft/Phi-3-mini-4k-instruct": -0.02389761433005333,
"mistralai/tekken": 0.06866628676652908,
"tiktoken/gpt-4o": 0.03231961280107498,
"tokenmonster/englishcode-32000-consistent-v1": 0.13605159521102905
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 9,
"facebook/xglm-564M": 9,
"google-bert/bert-base-multilingual-cased": 9,
"google/byt5-small": 33,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The capital city of IR is
choices: ["Mashhad", "Tehran", "Baghdad", "Isfahan"]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 305-0.2 (set_id: 305, variation_id: 0.2)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8709415793418884,
"Qwen/Qwen3-8B": 0.8861888647079468,
"bigscience/bloom": 0.8826625347137451,
"common-pile/comma-v0.1-1t": 0.8774277567863464,
"facebook/xglm-564M": 0.868166446685791,
"google-bert/bert-base-multilingual-cased": 0.8571589589118958,
"google/byt5-small": 0.9628745317459106,
"google/gemma-2-2b": 0.875096321105957,
"gpt2": 0.8794789910316467,
"meta-llama/Llama-3.2-1B": 0.8609308004379272,
"microsoft/Phi-3-mini-4k-instruct": 0.864482581615448,
"mistralai/tekken": 0.851871907711029,
"tiktoken/gpt-4o": 0.8562781810760498,
"tokenmonster/englishcode-32000-consistent-v1": 0.7831379771232605
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15519516170024872,
"Qwen/Qwen3-8B": 0.22655273973941803,
"bigscience/bloom": 0.24181559681892395,
"common-pile/comma-v0.1-1t": 0.22090619802474976,
"facebook/xglm-564M": 0.1887182891368866,
"google-bert/bert-base-multilingual-cased": 0.15784157812595367,
"google/byt5-small": 0.24735668301582336,
"google/gemma-2-2b": 0.18847647309303284,
"gpt2": 0.25452691316604614,
"meta-llama/Llama-3.2-1B": 0.16874276101589203,
"microsoft/Phi-3-mini-4k-instruct": 0.13977181911468506,
"mistralai/tekken": 0.13231658935546875,
"tiktoken/gpt-4o": 0.1192522943019867,
"tokenmonster/englishcode-32000-consistent-v1": 0.10740450024604797
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 6,
"Qwen/Qwen3-8B": 6,
"bigscience/bloom": 6,
"common-pile/comma-v0.1-1t": 7,
"facebook/xglm-564M": 6,
"google-bert/bert-base-multilingual-cased": 6,
"google/byt5-small": 25,
"google/gemma-2-2b": 6,
"gpt2": 6,
"meta-llama/Llama-3.2-1B": 6,
"microsoft/Phi-3-mini-4k-instruct": 6,
"mistralai/tekken": 6,
"tiktoken/gpt-4o": 6,
"tokenmonster/englishcode-32000-consistent-v1": 6
}
---
question: The # of days in a week is
choices: ["7", "5", "6", "8"]
answer: 0 (answer_label: A)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 306-0.3 (set_id: 306, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8781460523605347,
"Qwen/Qwen3-8B": 0.8977583050727844,
"bigscience/bloom": 0.8890756368637085,
"common-pile/comma-v0.1-1t": 0.9004440307617188,
"facebook/xglm-564M": 0.8400768041610718,
"google-bert/bert-base-multilingual-cased": 0.8637771010398865,
"google/byt5-small": 0.9125043749809265,
"google/gemma-2-2b": 0.8880384564399719,
"gpt2": 0.8734878897666931,
"meta-llama/Llama-3.2-1B": 0.8814516067504883,
"microsoft/Phi-3-mini-4k-instruct": 0.8667116165161133,
"mistralai/tekken": 0.8863945007324219,
"tiktoken/gpt-4o": 0.8898459672927856,
"tokenmonster/englishcode-32000-consistent-v1": 0.758175253868103
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 26,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The # of hours in a day is
choices: ["20", "24", "25", "30"]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 307-0.3 (set_id: 307, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8746010661125183,
"Qwen/Qwen3-8B": 0.8972941040992737,
"bigscience/bloom": 0.8836281299591064,
"common-pile/comma-v0.1-1t": 0.8991660475730896,
"facebook/xglm-564M": 0.8364160656929016,
"google-bert/bert-base-multilingual-cased": 0.862697422504425,
"google/byt5-small": 0.9182883501052856,
"google/gemma-2-2b": 0.8852136135101318,
"gpt2": 0.8709038496017456,
"meta-llama/Llama-3.2-1B": 0.8853892683982849,
"microsoft/Phi-3-mini-4k-instruct": 0.8643823862075806,
"mistralai/tekken": 0.8764235377311707,
"tiktoken/gpt-4o": 0.8882663249969482,
"tokenmonster/englishcode-32000-consistent-v1": 0.7297150492668152
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 26,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The # of legs a cow has is
choices: ["8", "3", "5", "4"]
answer: 3 (answer_label: D)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 308-0.3 (set_id: 308, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8787575960159302,
"Qwen/Qwen3-8B": 0.903361439704895,
"bigscience/bloom": 0.8900041580200195,
"common-pile/comma-v0.1-1t": 0.8913084268569946,
"facebook/xglm-564M": 0.8609603047370911,
"google-bert/bert-base-multilingual-cased": 0.8892489671707153,
"google/byt5-small": 0.9177855849266052,
"google/gemma-2-2b": 0.8813795447349548,
"gpt2": 0.877772331237793,
"meta-llama/Llama-3.2-1B": 0.8910291790962219,
"microsoft/Phi-3-mini-4k-instruct": 0.8654680848121643,
"mistralai/tekken": 0.8807762265205383,
"tiktoken/gpt-4o": 0.8883159160614014,
"tokenmonster/englishcode-32000-consistent-v1": 0.7710891962051392
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 9,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 9,
"google/byt5-small": 26,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
---
question: The # of minutes in 2 hours is
choices: ["100", "120", "140", "90"]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 309-0.3 (set_id: 309, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8944236636161804,
"Qwen/Qwen3-8B": 0.9028605222702026,
"bigscience/bloom": 0.901308000087738,
"common-pile/comma-v0.1-1t": 0.9184684753417969,
"facebook/xglm-564M": 0.8589028716087341,
"google-bert/bert-base-multilingual-cased": 0.8760905265808105,
"google/byt5-small": 0.9449029564857483,
"google/gemma-2-2b": 0.9074434041976929,
"gpt2": 0.8900853991508484,
"meta-llama/Llama-3.2-1B": 0.9088870286941528,
"microsoft/Phi-3-mini-4k-instruct": 0.8868902325630188,
"mistralai/tekken": 0.9065263271331787,
"tiktoken/gpt-4o": 0.9071903824806213,
"tokenmonster/englishcode-32000-consistent-v1": 0.7755128145217896
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 9,
"Qwen/Qwen3-8B": 9,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 11,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 30,
"google/gemma-2-2b": 9,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 9,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 9,
"tiktoken/gpt-4o": 9,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
---
question: The # of months in a year is
choices: ["10", "11", "13", "12"]
answer: 3 (answer_label: D)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 310-0.3 (set_id: 310, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8758044838905334,
"Qwen/Qwen3-8B": 0.8943588733673096,
"bigscience/bloom": 0.8882232308387756,
"common-pile/comma-v0.1-1t": 0.9044252634048462,
"facebook/xglm-564M": 0.8348524570465088,
"google-bert/bert-base-multilingual-cased": 0.8641276359558105,
"google/byt5-small": 0.9306106567382812,
"google/gemma-2-2b": 0.8897289633750916,
"gpt2": 0.8724748492240906,
"meta-llama/Llama-3.2-1B": 0.8823840022087097,
"microsoft/Phi-3-mini-4k-instruct": 0.8683964610099792,
"mistralai/tekken": 0.8803357481956482,
"tiktoken/gpt-4o": 0.8929274678230286,
"tokenmonster/englishcode-32000-consistent-v1": 0.7467564344406128
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 28,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The # of seconds in a minute is
choices: ["50", "100", "60", "30"]
answer: 2 (answer_label: C)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 311-0.3 (set_id: 311, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8631638884544373,
"Qwen/Qwen3-8B": 0.8879849910736084,
"bigscience/bloom": 0.8815665245056152,
"common-pile/comma-v0.1-1t": 0.8988855481147766,
"facebook/xglm-564M": 0.825326144695282,
"google-bert/bert-base-multilingual-cased": 0.8537793159484863,
"google/byt5-small": 0.9459080100059509,
"google/gemma-2-2b": 0.8843560218811035,
"gpt2": 0.8683614134788513,
"meta-llama/Llama-3.2-1B": 0.8789793848991394,
"microsoft/Phi-3-mini-4k-instruct": 0.8600611090660095,
"mistralai/tekken": 0.8747711777687073,
"tiktoken/gpt-4o": 0.8857784271240234,
"tokenmonster/englishcode-32000-consistent-v1": 0.7317348122596741
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 31,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
---
question: The # of sides a hexagon has is
choices: ["5", "7", "8", "6"]
answer: 3 (answer_label: D)
split: test | subcategories: Abbreviations | lang: eng_Latn
id: 312-0.3 (set_id: 312, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8634425401687622,
"Qwen/Qwen3-8B": 0.9136554002761841,
"bigscience/bloom": 0.9019626975059509,
"common-pile/comma-v0.1-1t": 0.897915244102478,
"facebook/xglm-564M": 0.904183030128479,
"google-bert/bert-base-multilingual-cased": 0.8995038270950317,
"google/byt5-small": 0.9488453269004822,
"google/gemma-2-2b": 0.8825085163116455,
"gpt2": 0.8939532041549683,
"meta-llama/Llama-3.2-1B": 0.9000940322875977,
"microsoft/Phi-3-mini-4k-instruct": 0.8842352628707886,
"mistralai/tekken": 0.8938607573509216,
"tiktoken/gpt-4o": 0.9065819382667542,
"tokenmonster/englishcode-32000-consistent-v1": 0.8501964807510376
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 9,
"bigscience/bloom": 9,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 10,
"google-bert/bert-base-multilingual-cased": 10,
"google/byt5-small": 31,
"google/gemma-2-2b": 8,
"gpt2": 9,
"meta-llama/Llama-3.2-1B": 9,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 9,
"tiktoken/gpt-4o": 9,
"tokenmonster/englishcode-32000-consistent-v1": 11
}
---
question: The # of sides a triangle has is
choices: ["3", "2", "4", "5"]
answer: 0 (answer_label: A)
split: test | subcategories: Abbreviations | category: Social Media & Informal Text | lang: eng_Latn
id: 313-0.3 (set_id: 313, variation_id: 0.3)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.8722261786460876,
"Qwen/Qwen3-8B": 0.9027498364448547,
"bigscience/bloom": 0.8900589942932129,
"common-pile/comma-v0.1-1t": 0.8905171155929565,
"facebook/xglm-564M": 0.8620559573173523,
"google-bert/bert-base-multilingual-cased": 0.8632290959358215,
"google/byt5-small": 0.9473434090614319,
"google/gemma-2-2b": 0.8901822566986084,
"gpt2": 0.8811707496643066,
"meta-llama/Llama-3.2-1B": 0.8887614607810974,
"microsoft/Phi-3-mini-4k-instruct": 0.8653461933135986,
"mistralai/tekken": 0.8795601725578308,
"tiktoken/gpt-4o": 0.8902370929718018,
"tokenmonster/englishcode-32000-consistent-v1": 0.7794355750083923
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.15550069510936737,
"Qwen/Qwen3-8B": 0.17457246780395508,
"bigscience/bloom": 0.18650875985622406,
"common-pile/comma-v0.1-1t": 0.2236202359199524,
"facebook/xglm-564M": 0.05189888924360275,
"google-bert/bert-base-multilingual-cased": 0.11308808624744415,
"google/byt5-small": -0.02643284946680069,
"google/gemma-2-2b": 0.173978790640831,
"gpt2": 0.15644428133964539,
"meta-llama/Llama-3.2-1B": 0.19826795160770416,
"microsoft/Phi-3-mini-4k-instruct": 0.0790470540523529,
"mistralai/tekken": 0.16212503612041473,
"tiktoken/gpt-4o": 0.1929726004600525,
"tokenmonster/englishcode-32000-consistent-v1": 0.1629045158624649
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 9,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 32,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 8,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
---
question: In "I work at Apple", Apple is a
choices: ["co.", "pers.", "cty.", "fr."]
answer: 0 (answer_label: A)
split: test | subcategories: Abbreviations | lang: eng_Latn
id: 314-0.1 (set_id: 314, variation_id: 0.1)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 0.9999998807907104,
"common-pile/comma-v0.1-1t": 0.9999998807907104,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000001192092896,
"google/byt5-small": 1.0000001192092896,
"google/gemma-2-2b": 0.9999998211860657,
"gpt2": 1,
"meta-llama/Llama-3.2-1B": 1,
"microsoft/Phi-3-mini-4k-instruct": 1.0000001192092896,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 0.9999998807907104,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999997615814209
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 1.000000238418579,
"bigscience/bloom": 1.0000003576278687,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.0000004768371582,
"google-bert/bert-base-multilingual-cased": 1.000000238418579,
"google/byt5-small": 1.000000238418579,
"google/gemma-2-2b": 1.000000238418579,
"gpt2": 1.0000003576278687,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000004768371582,
"mistralai/tekken": 1.0000003576278687,
"tiktoken/gpt-4o": 1.0000003576278687,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000005960464478
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 10,
"Qwen/Qwen3-8B": 10,
"bigscience/bloom": 10,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 10,
"google-bert/bert-base-multilingual-cased": 11,
"google/byt5-small": 32,
"google/gemma-2-2b": 10,
"gpt2": 10,
"meta-llama/Llama-3.2-1B": 10,
"microsoft/Phi-3-mini-4k-instruct": 10,
"mistralai/tekken": 10,
"tiktoken/gpt-4o": 10,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
---
question: In "I work at Google", Google is a
choices: ["pers.", "co.", "cty.", "fr."]
answer: 1 (answer_label: B)
split: test | subcategories: Abbreviations | lang: eng_Latn
id: 315-0.1 (set_id: 315, variation_id: 0.1)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.9999999403953552,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 0.9999998807907104,
"common-pile/comma-v0.1-1t": 1,
"facebook/xglm-564M": 1,
"google-bert/bert-base-multilingual-cased": 1.0000001192092896,
"google/byt5-small": 1.0000001192092896,
"google/gemma-2-2b": 0.9999998807907104,
"gpt2": 1.0000001192092896,
"meta-llama/Llama-3.2-1B": 0.9999999403953552,
"microsoft/Phi-3-mini-4k-instruct": 1.0000001192092896,
"mistralai/tekken": 0.9999998807907104,
"tiktoken/gpt-4o": 1.0000001192092896,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999997615814209
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 1.000000238418579,
"bigscience/bloom": 1.0000003576278687,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.0000004768371582,
"google-bert/bert-base-multilingual-cased": 1.000000238418579,
"google/byt5-small": 1.000000238418579,
"google/gemma-2-2b": 1.000000238418579,
"gpt2": 1.0000003576278687,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000004768371582,
"mistralai/tekken": 1.0000003576278687,
"tiktoken/gpt-4o": 1.0000003576278687,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000005960464478
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 10,
"Qwen/Qwen3-8B": 10,
"bigscience/bloom": 10,
"common-pile/comma-v0.1-1t": 10,
"facebook/xglm-564M": 10,
"google-bert/bert-base-multilingual-cased": 11,
"google/byt5-small": 34,
"google/gemma-2-2b": 10,
"gpt2": 10,
"meta-llama/Llama-3.2-1B": 10,
"microsoft/Phi-3-mini-4k-instruct": 10,
"mistralai/tekken": 10,
"tiktoken/gpt-4o": 10,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
---
question: In "Microsoft released a new update", Microsoft is a
choices: ["co.", "pers.", "cty.", "fr."]
answer: 0 (answer_label: A)
split: test | subcategories: Abbreviations | lang: eng_Latn
id: 316-0.1 (set_id: 316, variation_id: 0.1)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 0.9999999403953552,
"bigscience/bloom": 0.9999998807907104,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 0.9999998211860657,
"google-bert/bert-base-multilingual-cased": 0.9999999403953552,
"google/byt5-small": 1,
"google/gemma-2-2b": 0.9999996423721313,
"gpt2": 1.000000238418579,
"meta-llama/Llama-3.2-1B": 0.9999999403953552,
"microsoft/Phi-3-mini-4k-instruct": 1.0000001192092896,
"mistralai/tekken": 1.0000001192092896,
"tiktoken/gpt-4o": 1.0000001192092896,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000001192092896
}
trimmed_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 1.000000238418579,
"bigscience/bloom": 1.0000003576278687,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.0000004768371582,
"google-bert/bert-base-multilingual-cased": 1.000000238418579,
"google/byt5-small": 1.000000238418579,
"google/gemma-2-2b": 1.000000238418579,
"gpt2": 1.0000003576278687,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000004768371582,
"mistralai/tekken": 1.0000003576278687,
"tiktoken/gpt-4o": 1.0000003576278687,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000005960464478
}
token_counts:
{
"CohereLabs/aya-expanse-8b": 11,
"Qwen/Qwen3-8B": 11,
"bigscience/bloom": 12,
"common-pile/comma-v0.1-1t": 11,
"facebook/xglm-564M": 11,
"google-bert/bert-base-multilingual-cased": 12,
"google/byt5-small": 52,
"google/gemma-2-2b": 11,
"gpt2": 11,
"meta-llama/Llama-3.2-1B": 11,
"microsoft/Phi-3-mini-4k-instruct": 11,
"mistralai/tekken": 11,
"tiktoken/gpt-4o": 11,
"tokenmonster/englishcode-32000-consistent-v1": 9
}
---
question: In "The cat sat on the mat", the subj. is
choices: ["sat", "the mat", "the cat", "on"]
answer: 2 (answer_label: C)
split: test | subcategories: Abbreviations | lang: eng_Latn
id: 317-0.1 (set_id: 317, variation_id: 0.1)
vanilla_cos_sim_to_canonical:
{
"CohereLabs/aya-expanse-8b": 0.9345534443855286,
"Qwen/Qwen3-8B": 0.9383683204650879,
"bigscience/bloom": 0.9273239374160767,
"common-pile/comma-v0.1-1t": 0.8940730094909668,
"facebook/xglm-564M": 0.8844199776649475,
"google-bert/bert-base-multilingual-cased": 0.8985786437988281,
"google/byt5-small": 0.9816980957984924,
"google/gemma-2-2b": 0.9320594668388367,
"gpt2": 0.8804003000259399,
"meta-llama/Llama-3.2-1B": 0.92516028881073,
"microsoft/Phi-3-mini-4k-instruct": 0.8588052988052368,
"mistralai/tekken": 0.8823502659797668,
"tiktoken/gpt-4o": 0.9314396381378174,
"tokenmonster/englishcode-32000-consistent-v1": 0.8803983330726624
}
|
{
"CohereLabs/aya-expanse-8b": 0.08477406203746796,
"Qwen/Qwen3-8B": 0.10374213010072708,
"bigscience/bloom": 0.08070938289165497,
"common-pile/comma-v0.1-1t": 0.07346736639738083,
"facebook/xglm-564M": 0.05824323743581772,
"google-bert/bert-base-multilingual-cased": 0.08166754245758057,
"google/byt5-small": -0.046926841139793396,
"google/gemma-2-2b": 0.0717279389500618,
"gpt2": 0.026471275836229324,
"meta-llama/Llama-3.2-1B": 0.06531772017478943,
"microsoft/Phi-3-mini-4k-instruct": -0.03549063205718994,
"mistralai/tekken": 0.07283864170312881,
"tiktoken/gpt-4o": 0.07994227856397629,
"tokenmonster/englishcode-32000-consistent-v1": 0.05385168641805649
}
|
{
"CohereLabs/aya-expanse-8b": 13,
"Qwen/Qwen3-8B": 13,
"bigscience/bloom": 13,
"common-pile/comma-v0.1-1t": 14,
"facebook/xglm-564M": 14,
"google-bert/bert-base-multilingual-cased": 15,
"google/byt5-small": 41,
"google/gemma-2-2b": 13,
"gpt2": 14,
"meta-llama/Llama-3.2-1B": 13,
"microsoft/Phi-3-mini-4k-instruct": 14,
"mistralai/tekken": 14,
"tiktoken/gpt-4o": 13,
"tokenmonster/englishcode-32000-consistent-v1": 12
}
|
|||
The gas humans need to breathe to live is
|
[
"CH₄",
"He",
"O₂",
"H₂"
] | 2 |
C
|
test
|
Abbreviations
|
eng_Latn
|
322-0.1
| 322 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1.0000001192092896,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 1,
"facebook/xglm-564M": 1.0000001192092896,
"google-bert/bert-base-multilingual-cased": 0.9999999403953552,
"google/byt5-small": 0.9999997615814209,
"google/gemma-2-2b": 1.0000001192092896,
"gpt2": 1.0000001192092896,
"meta-llama/Llama-3.2-1B": 1.0000001192092896,
"microsoft/Phi-3-mini-4k-instruct": 1.0000001192092896,
"mistralai/tekken": 1.000000238418579,
"tiktoken/gpt-4o": 1.000000238418579,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999999403953552
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 9,
"Qwen/Qwen3-8B": 9,
"bigscience/bloom": 9,
"common-pile/comma-v0.1-1t": 14,
"facebook/xglm-564M": 9,
"google-bert/bert-base-multilingual-cased": 10,
"google/byt5-small": 41,
"google/gemma-2-2b": 9,
"gpt2": 9,
"meta-llama/Llama-3.2-1B": 9,
"microsoft/Phi-3-mini-4k-instruct": 11,
"mistralai/tekken": 9,
"tiktoken/gpt-4o": 9,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
|
|||
Chad's cap. is
|
[
"N'Djamena",
"Moundou",
"Abéché",
"Ngama"
] | 0 |
A
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
326-0.2
| 326 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.7013799548149109,
"Qwen/Qwen3-8B": 0.7792327404022217,
"bigscience/bloom": 0.7729296684265137,
"common-pile/comma-v0.1-1t": 0.7977029085159302,
"facebook/xglm-564M": 0.7329659461975098,
"google-bert/bert-base-multilingual-cased": 0.7466813325881958,
"google/byt5-small": 0.9061909317970276,
"google/gemma-2-2b": 0.7286580204963684,
"gpt2": 0.783542275428772,
"meta-llama/Llama-3.2-1B": 0.7782202959060669,
"microsoft/Phi-3-mini-4k-instruct": 0.8116836547851562,
"mistralai/tekken": 0.7620350122451782,
"tiktoken/gpt-4o": 0.781197190284729,
"tokenmonster/englishcode-32000-consistent-v1": 0.8143059015274048
}
|
{
"CohereLabs/aya-expanse-8b": 0.12239085137844086,
"Qwen/Qwen3-8B": 0.054030366241931915,
"bigscience/bloom": 0.02676394209265709,
"common-pile/comma-v0.1-1t": 0.05686108395457268,
"facebook/xglm-564M": 0.06816975772380829,
"google-bert/bert-base-multilingual-cased": 0.057269658893346786,
"google/byt5-small": -0.02295071817934513,
"google/gemma-2-2b": 0.09922298789024353,
"gpt2": 0.10866591334342957,
"meta-llama/Llama-3.2-1B": 0.09607133269309998,
"microsoft/Phi-3-mini-4k-instruct": 0.13172483444213867,
"mistralai/tekken": 0.04374687373638153,
"tiktoken/gpt-4o": 0.08728465437889099,
"tokenmonster/englishcode-32000-consistent-v1": 0.06787453591823578
}
|
{
"CohereLabs/aya-expanse-8b": 5,
"Qwen/Qwen3-8B": 6,
"bigscience/bloom": 6,
"common-pile/comma-v0.1-1t": 6,
"facebook/xglm-564M": 6,
"google-bert/bert-base-multilingual-cased": 6,
"google/byt5-small": 14,
"google/gemma-2-2b": 6,
"gpt2": 6,
"meta-llama/Llama-3.2-1B": 6,
"microsoft/Phi-3-mini-4k-instruct": 7,
"mistralai/tekken": 6,
"tiktoken/gpt-4o": 6,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
|
||
The capital of FR is
|
[
"London",
"Paris",
"Berlin",
"Rome"
] | 1 |
B
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
327-0.2
| 327 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.807169497013092,
"Qwen/Qwen3-8B": 0.8451246023178101,
"bigscience/bloom": 0.8283455967903137,
"common-pile/comma-v0.1-1t": 0.8237259984016418,
"facebook/xglm-564M": 0.806121826171875,
"google-bert/bert-base-multilingual-cased": 0.7981747388839722,
"google/byt5-small": 0.9172342419624329,
"google/gemma-2-2b": 0.7995125651359558,
"gpt2": 0.8134028315544128,
"meta-llama/Llama-3.2-1B": 0.8076421618461609,
"microsoft/Phi-3-mini-4k-instruct": 0.8156378865242004,
"mistralai/tekken": 0.7921538352966309,
"tiktoken/gpt-4o": 0.7901878356933594,
"tokenmonster/englishcode-32000-consistent-v1": 0.7765400409698486
}
|
{
"CohereLabs/aya-expanse-8b": 0.08761397004127502,
"Qwen/Qwen3-8B": 0.14105814695358276,
"bigscience/bloom": 0.1934700310230255,
"common-pile/comma-v0.1-1t": 0.09907892346382141,
"facebook/xglm-564M": 0.08311900496482849,
"google-bert/bert-base-multilingual-cased": 0.13695181906223297,
"google/byt5-small": 0.14798158407211304,
"google/gemma-2-2b": 0.01552680041640997,
"gpt2": 0.14244207739830017,
"meta-llama/Llama-3.2-1B": 0.14876556396484375,
"microsoft/Phi-3-mini-4k-instruct": 0.1234123557806015,
"mistralai/tekken": 0.08589836210012436,
"tiktoken/gpt-4o": 0.07952668517827988,
"tokenmonster/englishcode-32000-consistent-v1": 0.026012370362877846
}
|
{
"CohereLabs/aya-expanse-8b": 5,
"Qwen/Qwen3-8B": 5,
"bigscience/bloom": 5,
"common-pile/comma-v0.1-1t": 6,
"facebook/xglm-564M": 5,
"google-bert/bert-base-multilingual-cased": 5,
"google/byt5-small": 20,
"google/gemma-2-2b": 5,
"gpt2": 5,
"meta-llama/Llama-3.2-1B": 5,
"microsoft/Phi-3-mini-4k-instruct": 5,
"mistralai/tekken": 5,
"tiktoken/gpt-4o": 5,
"tokenmonster/englishcode-32000-consistent-v1": 6
}
|
||
The capital of JP is
|
[
"Kyoto",
"Osaka",
"Hiroshima",
"Tokyo"
] | 3 |
D
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
328-0.2
| 328 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.838476300239563,
"Qwen/Qwen3-8B": 0.8522142171859741,
"bigscience/bloom": 0.8232855200767517,
"common-pile/comma-v0.1-1t": 0.8368542194366455,
"facebook/xglm-564M": 0.8285537362098694,
"google-bert/bert-base-multilingual-cased": 0.8010603785514832,
"google/byt5-small": 0.9203615784645081,
"google/gemma-2-2b": 0.8251798152923584,
"gpt2": 0.8022357225418091,
"meta-llama/Llama-3.2-1B": 0.8143433332443237,
"microsoft/Phi-3-mini-4k-instruct": 0.7175941467285156,
"mistralai/tekken": 0.8012662529945374,
"tiktoken/gpt-4o": 0.8201245069503784,
"tokenmonster/englishcode-32000-consistent-v1": 0.8073024749755859
}
|
{
"CohereLabs/aya-expanse-8b": 0.17153622210025787,
"Qwen/Qwen3-8B": 0.12307695299386978,
"bigscience/bloom": 0.08853715658187866,
"common-pile/comma-v0.1-1t": 0.11716070026159286,
"facebook/xglm-564M": 0.15991993248462677,
"google-bert/bert-base-multilingual-cased": 0.08823803812265396,
"google/byt5-small": 0.2070769965648651,
"google/gemma-2-2b": 0.06976078450679779,
"gpt2": 0.08312174677848816,
"meta-llama/Llama-3.2-1B": 0.12662671506404877,
"microsoft/Phi-3-mini-4k-instruct": 0.13243405520915985,
"mistralai/tekken": 0.06049402803182602,
"tiktoken/gpt-4o": 0.14071625471115112,
"tokenmonster/englishcode-32000-consistent-v1": 0.10549987852573395
}
|
{
"CohereLabs/aya-expanse-8b": 5,
"Qwen/Qwen3-8B": 5,
"bigscience/bloom": 5,
"common-pile/comma-v0.1-1t": 6,
"facebook/xglm-564M": 5,
"google-bert/bert-base-multilingual-cased": 5,
"google/byt5-small": 20,
"google/gemma-2-2b": 5,
"gpt2": 5,
"meta-llama/Llama-3.2-1B": 5,
"microsoft/Phi-3-mini-4k-instruct": 6,
"mistralai/tekken": 5,
"tiktoken/gpt-4o": 5,
"tokenmonster/englishcode-32000-consistent-v1": 6
}
|
||
The capital of TR is
|
[
"İstanbul",
"İzmir",
"Bursa",
"Ankara"
] | 3 |
D
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
329-0.2
| 329 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.8134791254997253,
"Qwen/Qwen3-8B": 0.8371235132217407,
"bigscience/bloom": 0.8147814869880676,
"common-pile/comma-v0.1-1t": 0.8117464184761047,
"facebook/xglm-564M": 0.8058038949966431,
"google-bert/bert-base-multilingual-cased": 0.807705283164978,
"google/byt5-small": 0.8975408673286438,
"google/gemma-2-2b": 0.8226367831230164,
"gpt2": 0.7932828664779663,
"meta-llama/Llama-3.2-1B": 0.8048302531242371,
"microsoft/Phi-3-mini-4k-instruct": 0.7885987758636475,
"mistralai/tekken": 0.8118667602539062,
"tiktoken/gpt-4o": 0.7969956994056702,
"tokenmonster/englishcode-32000-consistent-v1": 0.7851678729057312
}
|
{
"CohereLabs/aya-expanse-8b": 0.07706472277641296,
"Qwen/Qwen3-8B": 0.09242145717144012,
"bigscience/bloom": 0.11954235285520554,
"common-pile/comma-v0.1-1t": 0.0688096210360527,
"facebook/xglm-564M": 0.056273337453603745,
"google-bert/bert-base-multilingual-cased": 0.184139221906662,
"google/byt5-small": 0.12901875376701355,
"google/gemma-2-2b": 0.11601769179105759,
"gpt2": 0.07733984291553497,
"meta-llama/Llama-3.2-1B": 0.14307504892349243,
"microsoft/Phi-3-mini-4k-instruct": -0.00042920373380184174,
"mistralai/tekken": 0.1253935694694519,
"tiktoken/gpt-4o": 0.04226153716444969,
"tokenmonster/englishcode-32000-consistent-v1": -0.0061051626689732075
}
|
{
"CohereLabs/aya-expanse-8b": 5,
"Qwen/Qwen3-8B": 5,
"bigscience/bloom": 5,
"common-pile/comma-v0.1-1t": 6,
"facebook/xglm-564M": 5,
"google-bert/bert-base-multilingual-cased": 5,
"google/byt5-small": 20,
"google/gemma-2-2b": 5,
"gpt2": 5,
"meta-llama/Llama-3.2-1B": 5,
"microsoft/Phi-3-mini-4k-instruct": 5,
"mistralai/tekken": 5,
"tiktoken/gpt-4o": 5,
"tokenmonster/englishcode-32000-consistent-v1": 6
}
|
||
The chem. formula for water is
|
[
"CO2",
"NaCl",
"O2",
"H2O"
] | 3 |
D
|
test
|
Abbreviations
|
eng_Latn
|
330-0.20
| 330 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.8213294744491577,
"Qwen/Qwen3-8B": 0.8713029026985168,
"bigscience/bloom": 0.8499932885169983,
"common-pile/comma-v0.1-1t": 0.8503109216690063,
"facebook/xglm-564M": 0.7378727793693542,
"google-bert/bert-base-multilingual-cased": 0.7587925791740417,
"google/byt5-small": 0.9591166973114014,
"google/gemma-2-2b": 0.8184010982513428,
"gpt2": 0.8338674306869507,
"meta-llama/Llama-3.2-1B": 0.819198727607727,
"microsoft/Phi-3-mini-4k-instruct": 0.8195061683654785,
"mistralai/tekken": 0.8313064575195312,
"tiktoken/gpt-4o": 0.836508572101593,
"tokenmonster/englishcode-32000-consistent-v1": 0.848869264125824
}
|
{
"CohereLabs/aya-expanse-8b": 0.1875496208667755,
"Qwen/Qwen3-8B": 0.21313846111297607,
"bigscience/bloom": 0.23347987234592438,
"common-pile/comma-v0.1-1t": 0.18952646851539612,
"facebook/xglm-564M": 0.021598808467388153,
"google-bert/bert-base-multilingual-cased": 0.016012798994779587,
"google/byt5-small": -0.02367701567709446,
"google/gemma-2-2b": 0.17226368188858032,
"gpt2": 0.20595847070217133,
"meta-llama/Llama-3.2-1B": 0.2124955654144287,
"microsoft/Phi-3-mini-4k-instruct": 0.15150737762451172,
"mistralai/tekken": 0.24085702002048492,
"tiktoken/gpt-4o": 0.16686344146728516,
"tokenmonster/englishcode-32000-consistent-v1": 0.14736664295196533
}
|
{
"CohereLabs/aya-expanse-8b": 7,
"Qwen/Qwen3-8B": 7,
"bigscience/bloom": 7,
"common-pile/comma-v0.1-1t": 7,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 30,
"google/gemma-2-2b": 7,
"gpt2": 7,
"meta-llama/Llama-3.2-1B": 7,
"microsoft/Phi-3-mini-4k-instruct": 7,
"mistralai/tekken": 7,
"tiktoken/gpt-4o": 7,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
|
|||
The intent in "What time does the store close?" is
|
[
"purch",
"book",
"info",
"complain"
] | 2 |
C
|
test
|
Abbreviations
|
eng_Latn
|
331-0.1
| 331 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 0.9999998807907104,
"Qwen/Qwen3-8B": 1.0000001192092896,
"bigscience/bloom": 0.9999998807907104,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 1,
"google-bert/bert-base-multilingual-cased": 1.0000001192092896,
"google/byt5-small": 1,
"google/gemma-2-2b": 1,
"gpt2": 1,
"meta-llama/Llama-3.2-1B": 0.9999995827674866,
"microsoft/Phi-3-mini-4k-instruct": 0.9999997615814209,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 0.9999998807907104,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999999403953552
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 12,
"Qwen/Qwen3-8B": 12,
"bigscience/bloom": 12,
"common-pile/comma-v0.1-1t": 13,
"facebook/xglm-564M": 12,
"google-bert/bert-base-multilingual-cased": 13,
"google/byt5-small": 50,
"google/gemma-2-2b": 12,
"gpt2": 12,
"meta-llama/Llama-3.2-1B": 12,
"microsoft/Phi-3-mini-4k-instruct": 12,
"mistralai/tekken": 12,
"tiktoken/gpt-4o": 12,
"tokenmonster/englishcode-32000-consistent-v1": 11
}
|
|||
The largest mammal in the world is
|
[
"dolphin",
"blue whale",
"giraffe",
"bear"
] | 1 |
B
|
test
|
Abbreviations
|
eng_Latn
|
332-0.1
| 332 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 0.9999999403953552,
"Qwen/Qwen3-8B": 0.9999997615814209,
"bigscience/bloom": 0.9999997019767761,
"common-pile/comma-v0.1-1t": 1,
"facebook/xglm-564M": 0.9999997615814209,
"google-bert/bert-base-multilingual-cased": 1,
"google/byt5-small": 0.9999998211860657,
"google/gemma-2-2b": 0.9999999403953552,
"gpt2": 1,
"meta-llama/Llama-3.2-1B": 0.9999998211860657,
"microsoft/Phi-3-mini-4k-instruct": 0.9999998807907104,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 0.9999997615814209,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999998807907104
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 7,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 8,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 34,
"google/gemma-2-2b": 7,
"gpt2": 7,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
|
|||
The unit of measurement for temperature in the International System is
|
[
"°C",
"°F",
"K",
"°R"
] | 2 |
C
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
333-0.1
| 333 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1.0000001192092896,
"bigscience/bloom": 0.9999998807907104,
"common-pile/comma-v0.1-1t": 1,
"facebook/xglm-564M": 0.9999998807907104,
"google-bert/bert-base-multilingual-cased": 0.9999998211860657,
"google/byt5-small": 1,
"google/gemma-2-2b": 0.9999999403953552,
"gpt2": 1.0000001192092896,
"meta-llama/Llama-3.2-1B": 1,
"microsoft/Phi-3-mini-4k-instruct": 1,
"mistralai/tekken": 1.0000001192092896,
"tiktoken/gpt-4o": 1.0000001192092896,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999999403953552
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.000000238418579
}
|
{
"CohereLabs/aya-expanse-8b": 11,
"Qwen/Qwen3-8B": 11,
"bigscience/bloom": 11,
"common-pile/comma-v0.1-1t": 13,
"facebook/xglm-564M": 11,
"google-bert/bert-base-multilingual-cased": 11,
"google/byt5-small": 70,
"google/gemma-2-2b": 11,
"gpt2": 11,
"meta-llama/Llama-3.2-1B": 11,
"microsoft/Phi-3-mini-4k-instruct": 11,
"mistralai/tekken": 11,
"tiktoken/gpt-4o": 11,
"tokenmonster/englishcode-32000-consistent-v1": 10
}
|
||
The country whose space agency is NASA is
|
[
"RU",
"CN",
"JP",
"US"
] | 3 |
D
|
test
|
Abbreviations
|
eng_Latn
|
334-0.1
| 334 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 0.9999998807907104,
"bigscience/bloom": 0.9999997019767761,
"common-pile/comma-v0.1-1t": 0.9999997615814209,
"facebook/xglm-564M": 0.9999997615814209,
"google-bert/bert-base-multilingual-cased": 1,
"google/byt5-small": 0.9999998211860657,
"google/gemma-2-2b": 0.9999997615814209,
"gpt2": 1.0000001192092896,
"meta-llama/Llama-3.2-1B": 0.9999998807907104,
"microsoft/Phi-3-mini-4k-instruct": 1.000000238418579,
"mistralai/tekken": 0.9999996423721313,
"tiktoken/gpt-4o": 0.9999998807907104,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000001192092896
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 8,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 41,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 9
}
|
|||
The lang. spoken in Brazil is
|
[
"Portuguese",
"Spanish",
"French",
"Italian"
] | 0 |
A
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
335-0.2
| 335 | 0.2 |
{
"CohereLabs/aya-expanse-8b": 0.7906010746955872,
"Qwen/Qwen3-8B": 0.8113056421279907,
"bigscience/bloom": 0.8049988746643066,
"common-pile/comma-v0.1-1t": 0.8325293064117432,
"facebook/xglm-564M": 0.7717224359512329,
"google-bert/bert-base-multilingual-cased": 0.7581318616867065,
"google/byt5-small": 0.9475886821746826,
"google/gemma-2-2b": 0.7959599494934082,
"gpt2": 0.788682222366333,
"meta-llama/Llama-3.2-1B": 0.8063963055610657,
"microsoft/Phi-3-mini-4k-instruct": 0.7926681041717529,
"mistralai/tekken": 0.7794783711433411,
"tiktoken/gpt-4o": 0.8209499716758728,
"tokenmonster/englishcode-32000-consistent-v1": 0.7077274322509766
}
|
{
"CohereLabs/aya-expanse-8b": 0.10672236233949661,
"Qwen/Qwen3-8B": 0.07510721683502197,
"bigscience/bloom": 0.12772142887115479,
"common-pile/comma-v0.1-1t": 0.06116257607936859,
"facebook/xglm-564M": 0.09400343149900436,
"google-bert/bert-base-multilingual-cased": 0.024091584607958794,
"google/byt5-small": -0.025367457419633865,
"google/gemma-2-2b": 0.14458352327346802,
"gpt2": 0.10195167362689972,
"meta-llama/Llama-3.2-1B": 0.11951098591089249,
"microsoft/Phi-3-mini-4k-instruct": 0.04442334175109863,
"mistralai/tekken": 0.07316195219755173,
"tiktoken/gpt-4o": 0.16807381808757782,
"tokenmonster/englishcode-32000-consistent-v1": 0.16010284423828125
}
|
{
"CohereLabs/aya-expanse-8b": 7,
"Qwen/Qwen3-8B": 7,
"bigscience/bloom": 7,
"common-pile/comma-v0.1-1t": 8,
"facebook/xglm-564M": 7,
"google-bert/bert-base-multilingual-cased": 7,
"google/byt5-small": 29,
"google/gemma-2-2b": 7,
"gpt2": 7,
"meta-llama/Llama-3.2-1B": 7,
"microsoft/Phi-3-mini-4k-instruct": 7,
"mistralai/tekken": 7,
"tiktoken/gpt-4o": 7,
"tokenmonster/englishcode-32000-consistent-v1": 7
}
|
||
The metal with chemical sym. 'Fe' is
|
[
"lead",
"iron",
"zinc",
"gold"
] | 1 |
B
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
336-0.1
| 336 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 0.9015272855758667,
"Qwen/Qwen3-8B": 0.9102910757064819,
"bigscience/bloom": 0.8799982070922852,
"common-pile/comma-v0.1-1t": 0.9248884916305542,
"facebook/xglm-564M": 0.8538807034492493,
"google-bert/bert-base-multilingual-cased": 0.8467434644699097,
"google/byt5-small": 0.9709159731864929,
"google/gemma-2-2b": 0.9017316699028015,
"gpt2": 0.8987561464309692,
"meta-llama/Llama-3.2-1B": 0.9013137221336365,
"microsoft/Phi-3-mini-4k-instruct": 0.8870991468429565,
"mistralai/tekken": 0.8897440433502197,
"tiktoken/gpt-4o": 0.913067102432251,
"tokenmonster/englishcode-32000-consistent-v1": 0.868256688117981
}
|
{
"CohereLabs/aya-expanse-8b": 0.1325775533914566,
"Qwen/Qwen3-8B": 0.10851742327213287,
"bigscience/bloom": 0.05709414184093475,
"common-pile/comma-v0.1-1t": 0.03906556963920593,
"facebook/xglm-564M": 0.06293395161628723,
"google-bert/bert-base-multilingual-cased": 0.03629113733768463,
"google/byt5-small": -0.034767430275678635,
"google/gemma-2-2b": 0.10764612257480621,
"gpt2": 0.08282459527254105,
"meta-llama/Llama-3.2-1B": 0.1575041115283966,
"microsoft/Phi-3-mini-4k-instruct": 0.050431989133358,
"mistralai/tekken": 0.10146505385637283,
"tiktoken/gpt-4o": 0.2269553542137146,
"tokenmonster/englishcode-32000-consistent-v1": -0.060447804629802704
}
|
{
"CohereLabs/aya-expanse-8b": 10,
"Qwen/Qwen3-8B": 10,
"bigscience/bloom": 10,
"common-pile/comma-v0.1-1t": 15,
"facebook/xglm-564M": 11,
"google-bert/bert-base-multilingual-cased": 11,
"google/byt5-small": 36,
"google/gemma-2-2b": 10,
"gpt2": 10,
"meta-llama/Llama-3.2-1B": 10,
"microsoft/Phi-3-mini-4k-instruct": 10,
"mistralai/tekken": 10,
"tiktoken/gpt-4o": 10,
"tokenmonster/englishcode-32000-consistent-v1": 10
}
|
||
The planet closest to the Sun in our solar system is
|
[
"☿",
"♀",
"♂",
"♁"
] | 0 |
A
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
338-0.1
| 338 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 0.9999999403953552,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 0.9999999403953552,
"google-bert/bert-base-multilingual-cased": 0.9999997019767761,
"google/byt5-small": 1,
"google/gemma-2-2b": 0.9999999403953552,
"gpt2": 1,
"meta-llama/Llama-3.2-1B": 0.9999998807907104,
"microsoft/Phi-3-mini-4k-instruct": 0.9999999403953552,
"mistralai/tekken": 0.9999999403953552,
"tiktoken/gpt-4o": 1.0000001192092896,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999998211860657
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.000000238418579
}
|
{
"CohereLabs/aya-expanse-8b": 11,
"Qwen/Qwen3-8B": 11,
"bigscience/bloom": 11,
"common-pile/comma-v0.1-1t": 14,
"facebook/xglm-564M": 12,
"google-bert/bert-base-multilingual-cased": 11,
"google/byt5-small": 52,
"google/gemma-2-2b": 11,
"gpt2": 11,
"meta-llama/Llama-3.2-1B": 11,
"microsoft/Phi-3-mini-4k-instruct": 11,
"mistralai/tekken": 11,
"tiktoken/gpt-4o": 11,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
|
||
The largest planet in the Solar System is
|
[
"♁",
"♄",
"♃",
"♂"
] | 2 |
C
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
339-0.1
| 339 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 0.9999998807907104,
"Qwen/Qwen3-8B": 0.9999998211860657,
"bigscience/bloom": 0.9999997615814209,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 1,
"google-bert/bert-base-multilingual-cased": 0.9999999403953552,
"google/byt5-small": 1.000000238418579,
"google/gemma-2-2b": 1,
"gpt2": 0.9999996423721313,
"meta-llama/Llama-3.2-1B": 1,
"microsoft/Phi-3-mini-4k-instruct": 1,
"mistralai/tekken": 0.9999998211860657,
"tiktoken/gpt-4o": 0.9999998211860657,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999998211860657
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.000000238418579
}
|
{
"CohereLabs/aya-expanse-8b": 8,
"Qwen/Qwen3-8B": 8,
"bigscience/bloom": 8,
"common-pile/comma-v0.1-1t": 9,
"facebook/xglm-564M": 8,
"google-bert/bert-base-multilingual-cased": 8,
"google/byt5-small": 41,
"google/gemma-2-2b": 8,
"gpt2": 8,
"meta-llama/Llama-3.2-1B": 8,
"microsoft/Phi-3-mini-4k-instruct": 9,
"mistralai/tekken": 8,
"tiktoken/gpt-4o": 8,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
|
||
The process that allows plants to produce their own food using sunlight is
|
[
"resp.",
"dig.",
"ferm.",
"photo."
] | 3 |
D
|
test
|
Abbreviations
|
eng_Latn
|
340-0.1
| 340 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 1,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 0.9999998807907104,
"google-bert/bert-base-multilingual-cased": 0.9999999403953552,
"google/byt5-small": 1,
"google/gemma-2-2b": 1,
"gpt2": 0.9999998807907104,
"meta-llama/Llama-3.2-1B": 0.9999999403953552,
"microsoft/Phi-3-mini-4k-instruct": 0.9999999403953552,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 1.0000001192092896,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999998807907104
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 13,
"Qwen/Qwen3-8B": 13,
"bigscience/bloom": 13,
"common-pile/comma-v0.1-1t": 15,
"facebook/xglm-564M": 14,
"google-bert/bert-base-multilingual-cased": 14,
"google/byt5-small": 74,
"google/gemma-2-2b": 13,
"gpt2": 13,
"meta-llama/Llama-3.2-1B": 13,
"microsoft/Phi-3-mini-4k-instruct": 14,
"mistralai/tekken": 13,
"tiktoken/gpt-4o": 13,
"tokenmonster/englishcode-32000-consistent-v1": 10
}
|
|||
The author who wrote the play "Romeo and Juliet" is
|
[
"C. Dickens",
"M. Twain",
"W. Shakespeare",
"J. Austen"
] | 2 |
C
|
test
|
Abbreviations
|
eng_Latn
|
341-0.1
| 341 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1.0000001192092896,
"Qwen/Qwen3-8B": 1.0000001192092896,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 1.0000001192092896,
"facebook/xglm-564M": 1.0000001192092896,
"google-bert/bert-base-multilingual-cased": 0.9999998211860657,
"google/byt5-small": 1.0000001192092896,
"google/gemma-2-2b": 1,
"gpt2": 1,
"meta-llama/Llama-3.2-1B": 0.9999999403953552,
"microsoft/Phi-3-mini-4k-instruct": 1,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 1.000000238418579,
"tokenmonster/englishcode-32000-consistent-v1": 1
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 12,
"Qwen/Qwen3-8B": 14,
"bigscience/bloom": 13,
"common-pile/comma-v0.1-1t": 17,
"facebook/xglm-564M": 14,
"google-bert/bert-base-multilingual-cased": 12,
"google/byt5-small": 51,
"google/gemma-2-2b": 12,
"gpt2": 14,
"meta-llama/Llama-3.2-1B": 14,
"microsoft/Phi-3-mini-4k-instruct": 15,
"mistralai/tekken": 13,
"tiktoken/gpt-4o": 13,
"tokenmonster/englishcode-32000-consistent-v1": 14
}
|
|||
What plants need from the air to make food is
|
[
"N₂",
"H₂",
"He",
"CO₂"
] | 3 |
D
|
test
|
Abbreviations
|
eng_Latn
|
343-0.1
| 343 | 0.1 |
{
"CohereLabs/aya-expanse-8b": 1.000000238418579,
"Qwen/Qwen3-8B": 0.9999999403953552,
"bigscience/bloom": 1.0000001192092896,
"common-pile/comma-v0.1-1t": 0.9999998211860657,
"facebook/xglm-564M": 1,
"google-bert/bert-base-multilingual-cased": 0.9999998807907104,
"google/byt5-small": 1,
"google/gemma-2-2b": 0.9999997615814209,
"gpt2": 1.0000001192092896,
"meta-llama/Llama-3.2-1B": 1.0000001192092896,
"microsoft/Phi-3-mini-4k-instruct": 0.9999998807907104,
"mistralai/tekken": 1,
"tiktoken/gpt-4o": 0.9999998211860657,
"tokenmonster/englishcode-32000-consistent-v1": 0.9999998807907104
}
|
{
"CohereLabs/aya-expanse-8b": 1.0000003576278687,
"Qwen/Qwen3-8B": 1.0000004768371582,
"bigscience/bloom": 1.000000238418579,
"common-pile/comma-v0.1-1t": 1.0000004768371582,
"facebook/xglm-564M": 1.000000238418579,
"google-bert/bert-base-multilingual-cased": 1.0000004768371582,
"google/byt5-small": 1.0000004768371582,
"google/gemma-2-2b": 1.0000003576278687,
"gpt2": 1.0000004768371582,
"meta-llama/Llama-3.2-1B": 1.0000003576278687,
"microsoft/Phi-3-mini-4k-instruct": 1.0000003576278687,
"mistralai/tekken": 1.0000004768371582,
"tiktoken/gpt-4o": 1.0000004768371582,
"tokenmonster/englishcode-32000-consistent-v1": 1.0000003576278687
}
|
{
"CohereLabs/aya-expanse-8b": 10,
"Qwen/Qwen3-8B": 10,
"bigscience/bloom": 10,
"common-pile/comma-v0.1-1t": 12,
"facebook/xglm-564M": 10,
"google-bert/bert-base-multilingual-cased": 10,
"google/byt5-small": 45,
"google/gemma-2-2b": 10,
"gpt2": 10,
"meta-llama/Llama-3.2-1B": 10,
"microsoft/Phi-3-mini-4k-instruct": 10,
"mistralai/tekken": 10,
"tiktoken/gpt-4o": 10,
"tokenmonster/englishcode-32000-consistent-v1": 8
}
|
|||
In "Can you pls. book a flight to Paris?", the person wants to
|
[
"go shopping",
"file a complaint",
"make a booking",
"cancel reservation"
] | 2 |
C
|
test
|
Abbreviations
|
Social Media & Informal Text
|
eng_Latn
|
344-0.11
| 344 | 0.11 |
{
"CohereLabs/aya-expanse-8b": 0.8953720927238464,
"Qwen/Qwen3-8B": 0.9513365030288696,
"bigscience/bloom": 0.9068660140037537,
"common-pile/comma-v0.1-1t": 0.958304226398468,
"facebook/xglm-564M": 0.9203377366065979,
"google-bert/bert-base-multilingual-cased": 0.9147680997848511,
"google/byt5-small": 0.9851800203323364,
"google/gemma-2-2b": 0.9473385214805603,
"gpt2": 0.8919826149940491,
"meta-llama/Llama-3.2-1B": 0.9390807747840881,
"microsoft/Phi-3-mini-4k-instruct": 0.9023863077163696,
"mistralai/tekken": 0.884173572063446,
"tiktoken/gpt-4o": 0.9475116729736328,
"tokenmonster/englishcode-32000-consistent-v1": 0.8966159820556641
}
|
{
"CohereLabs/aya-expanse-8b": 0.012544890865683556,
"Qwen/Qwen3-8B": 0.1350504606962204,
"bigscience/bloom": 0.01425527036190033,
"common-pile/comma-v0.1-1t": 0.13568083941936493,
"facebook/xglm-564M": 0.06789200007915497,
"google-bert/bert-base-multilingual-cased": 0.08616456389427185,
"google/byt5-small": 0.30707496404647827,
"google/gemma-2-2b": 0.1704685240983963,
"gpt2": 0.0131788719445467,
"meta-llama/Llama-3.2-1B": 0.140251025557518,
"microsoft/Phi-3-mini-4k-instruct": 0.04106447473168373,
"mistralai/tekken": 0.01378481462597847,
"tiktoken/gpt-4o": 0.10346754640340805,
"tokenmonster/englishcode-32000-consistent-v1": 0.07150956243276596
}
|
{
"CohereLabs/aya-expanse-8b": 17,
"Qwen/Qwen3-8B": 16,
"bigscience/bloom": 18,
"common-pile/comma-v0.1-1t": 20,
"facebook/xglm-564M": 18,
"google-bert/bert-base-multilingual-cased": 18,
"google/byt5-small": 62,
"google/gemma-2-2b": 16,
"gpt2": 17,
"meta-llama/Llama-3.2-1B": 16,
"microsoft/Phi-3-mini-4k-instruct": 18,
"mistralai/tekken": 17,
"tiktoken/gpt-4o": 16,
"tokenmonster/englishcode-32000-consistent-v1": 14
}
|
Dataset Card for Tokenization Robustness
A comprehensive evaluation dataset for testing the robustness of different tokenization strategies.
Dataset Details
Dataset Description
This dataset evaluates how robust language models are to different tokenization strategies and edge cases. It includes text-completion questions with multiple-choice answers designed to test various aspects of tokenization handling.
- Curated by: R3
- Funded by [optional]: [More Information Needed]
- Shared by [optional]: [More Information Needed]
- Language(s) (NLP): [More Information Needed]
- License: cc
Dataset Sources [optional]
- Repository: [More Information Needed]
- Paper [optional]: [More Information Needed]
- Demo [optional]: [More Information Needed]
Uses
Direct Use
[More Information Needed]
Out-of-Scope Use
[More Information Needed]
Dataset Structure
The dataset contains multiple-choice questions with associated metadata about tokenization types and categories.
Dataset Creation
Curation Rationale
[More Information Needed]
Source Data
Data Collection and Processing
[More Information Needed]
Who are the source data producers?
[More Information Needed]
Annotations [optional]
Annotation process
[More Information Needed]
Who are the annotators?
[More Information Needed]
Personal and Sensitive Information
[More Information Needed]
Bias, Risks, and Limitations
The dataset focuses primarily on English text and may not generalize to other languages or tokenization schemes not covered in the evaluation.
Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
Citation [optional]
BibTeX:
[More Information Needed]
APA:
[More Information Needed]
Glossary [optional]
[More Information Needed]
More Information [optional]
[More Information Needed]
Dataset Card Authors [optional]
[More Information Needed]
Dataset Card Contact
[More Information Needed]
- Downloads last month
- 709