add bibtex
README.md CHANGED
@@ -17,7 +17,7 @@ This model is based on the small OpenAI GPT-2 ([`gpt2`](https://huggingface.co/g
 
 The Transformer layer weights in this model are identical to the original English model, but the lexical layer has been retrained for a Dutch vocabulary.
 
-For details, check out our paper on [arXiv](https://arxiv.org/abs/
+For details, check out our paper on [arXiv](https://arxiv.org/abs/2012.05628) and the code on [Github](https://github.com/wietsedv/gpt2-recycle).
 
 
 ## Related models
@@ -52,4 +52,12 @@ model = TFAutoModel.from_pretrained("GroNLP/gpt2-small-dutch-embeddings") # Ten
 ## BibTeX entry
 
 ```bibtex
+@misc{devries2020good,
+      title={As good as new. How to successfully recycle English GPT-2 to make models for other languages},
+      author={Wietse de Vries and Malvina Nissim},
+      year={2020},
+      eprint={2012.05628},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
 ```
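For reference, the second hunk's context line loads this model with `TFAutoModel`. A minimal loading sketch, assuming the standard Transformers `from_pretrained` API and the `GroNLP/gpt2-small-dutch-embeddings` model ID shown in that hunk header (the full usage section of the README is not part of this diff):

```python
from transformers import AutoTokenizer, AutoModel, TFAutoModel

# Tokenizer with the retrained Dutch vocabulary (lexical layer)
tokenizer = AutoTokenizer.from_pretrained("GroNLP/gpt2-small-dutch-embeddings")

# PyTorch weights (requires torch)
model = AutoModel.from_pretrained("GroNLP/gpt2-small-dutch-embeddings")

# TensorFlow weights, as in the diff context above (requires tensorflow)
model = TFAutoModel.from_pretrained("GroNLP/gpt2-small-dutch-embeddings")
```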