XLM-RoBERTa Fine-Tuned on Tigrinya (MLM)

This model is a fine-tuned version of xlm-roberta-base for the Tigrinya language (ትግርኛ), trained with the masked language modeling (MLM) objective. It uses a custom BPE tokenizer adapted to Tigrinya, with new token embeddings initialized via FastText-informed weighted averages of the pretrained XLM-R embeddings.

🔧 Details

  • Base model: xlm-roberta-base
  • Language: Tigrinya
  • Tokenizer: Custom BPE tokenizer (non-morpheme-aware)
  • Adaptation: Embedding initialization using weighted averages of pretrained XLM-R embeddings, guided by Tigrinya FastText word vectors
  • Training dataset: Tigrinya side of the NLLB (No Language Left Behind) parallel corpus
  • Objective: Masked Language Modeling (MLM)

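The adaptation step above can be sketched in a few lines: each new Tigrinya token's embedding is a weighted average of pretrained XLM-R embeddings, with weights derived from FastText similarity between the new token and existing vocabulary items. This is a minimal NumPy sketch under that assumption; the function name and the exact weighting scheme are illustrative, not taken from the model card:

```python
import numpy as np

def init_new_embeddings(xlmr_emb, sim, eps=1e-9):
    """Initialize embeddings for new tokens as similarity-weighted
    averages of pretrained embeddings (illustrative sketch).

    xlmr_emb: (V_old, d) pretrained XLM-R embedding matrix
    sim:      (V_new, V_old) non-negative similarity scores between each
              new token and each old token (e.g. cosine similarity of
              FastText vectors, clipped at zero)
    """
    weights = sim / (sim.sum(axis=1, keepdims=True) + eps)  # normalize rows
    return weights @ xlmr_emb                               # (V_new, d)

# Toy example: 3 pretrained embeddings of dim 2, 1 new token
xlmr_emb = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
sim = np.array([[1.0, 1.0, 0.0]])  # new token resembles the first two tokens
new_emb = init_new_embeddings(xlmr_emb, sim)
print(new_emb)  # ≈ [[0.5, 0.5]]
```

In practice the result replaces the rows of the model's input embedding matrix that correspond to newly added tokenizer entries before MLM training begins.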
🧪 Usage

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Hailay/xlmr-tigriyna-mlm")
model = AutoModelForMaskedLM.from_pretrained("Hailay/xlmr-tigriyna-mlm")

# Example Tigrinya input; for mask filling, include tokenizer.mask_token in the text
text = "ትግራይ ብምትሕብባ ንህዝቢ ግብሪ ቀጺሉ።"
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)  # outputs.logits: per-position scores over the vocabulary
```
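`outputs.logits` holds one score per vocabulary token at each input position; candidate fillers for a masked position are the top-k entries after a softmax. A minimal sketch of that decoding step using synthetic logits over a tiny vocabulary (the helper name and toy numbers are illustrative, not from the real model):

```python
import numpy as np

def top_k_tokens(logits, k=3):
    """Return the ids and probabilities of the k highest-scoring
    vocabulary entries for one masked position."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                   # softmax over the vocabulary
    top = np.argsort(probs)[::-1][:k]      # ids sorted by descending probability
    return top, probs[top]

# Synthetic logits over a 5-token vocabulary, standing in for
# outputs.logits[0, mask_position] from the real model
logits = np.array([0.1, 2.0, -1.0, 0.5, 1.5])
ids, probs = top_k_tokens(logits, k=2)
print(ids)  # [1 4]
```

With the real model, the returned ids would be decoded back to strings via `tokenizer.convert_ids_to_tokens`.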

📌 Intended Use

  • Pretraining for Tigrinya NLP tasks
  • Fine-tuning on classification, NER, QA, and other downstream tasks in Tigrinya
  • Research on low-resource Semitic and morphologically rich languages

📖 Citation
@misc{hailay2025tigrinya,
  title={Tigrinya MLM with XLM-R and FastText-Informed Embedding Initialization},
  author={Hailay Kidu},
  year={2025},
  url={https://huggingface.co/Hailay/xlmr-tigriyna-mlm}
}

🏷️ License

Apache License 2.0
Model size: 0.1B parameters · Tensor type: F32 · Format: Safetensors