---
library_name: transformers
tags: []
---

# Model Card

This model was created from [EleutherAI's Pythia-1.4B model](https://huggingface.co/EleutherAI/pythia-1.4b) by continued pretraining on the [BeanCounter dataset](https://huggingface.co/datasets/bradfordlevy/BeanCounter). Full details of the training process are available in [Wang and Levy (2024)](https://arxiv.org/abs/2409.17827). The model has not undergone any safety checks or alignment, so it should be used for research purposes only.
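
The model can be loaded through the standard `transformers` causal-LM API. The sketch below is a minimal example, not an official snippet: the repository id is a placeholder for this model's actual Hugging Face id, and the prompt is purely illustrative.

```
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id; substitute this model's actual Hugging Face id.
model_id = "bradfordlevy/BeanCounter-pythia-1.4b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Pythia-based models are plain causal LMs, so standard generation applies.
inputs = tokenizer("Revenue for the quarter was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```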

If you use this model in your work, please cite us:

```
@inproceedings{wang2024beancounter,
  title={BeanCounter: A low-toxicity, large-scale, and open dataset of business-oriented text},
  author={Siyan Wang and Bradford Levy},
  booktitle={The Thirty-eighth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2024},
  url={https://openreview.net/forum?id=HV5JhUZGpP}
}
```
|