---
library_name: transformers
license: mit
language:
- en
metrics:
- accuracy
- perplexity
base_model:
- bert-large-cased
pipeline_tag: fill-mask
---
# BERT large for filling user actions in requirement specifications
This model fills masked tokens ([MASK]) in requirements specifications. During fine-tuning, verbs identified via part-of-speech (POS) tagging were used as a proxy for user actions.
- **Developed by:** Fabian C. Peña, Steffen Herbold
- **Finetuned from:** [bert-large-cased](https://huggingface.co/bert-large-cased)
- **Replication kit:** [https://github.com/aieng-lab/senlp-benchmark](https://github.com/aieng-lab/senlp-benchmark)
- **Language:** English
- **License:** MIT
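## Usage
A minimal usage sketch with the standard `transformers` fill-mask pipeline. The model id below is a placeholder for this repository's id on the Hugging Face Hub, and the requirement sentence is an illustrative input, not taken from the training data.
```python
from transformers import pipeline

# Placeholder id: replace with this repository's id on the Hugging Face Hub.
fill_mask = pipeline("fill-mask", model="<this-model-id>")

# Illustrative requirement with the user action masked out.
predictions = fill_mask("The user shall be able to [MASK] a report from the dashboard.")

# Each prediction contains the proposed token and its score.
for prediction in predictions:
    print(f"{prediction['token_str']}\t{prediction['score']:.4f}")
```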
## Citation
```
@misc{pena2025benchmark,
  author = {Fabian Peña and Steffen Herbold},
  title = {Evaluating Large Language Models on Non-Code Software Engineering Tasks},
  year = {2025}
}
```