Users are encouraged to cite not only pangoling but also the Python package `transformers` (and the specific LLM they are using):
Nicenboim B (2025). pangoling: Access to large language model predictions in R. doi:10.5281/zenodo.7637526, R package version 1.0.3, https://github.com/ropensci/pangoling.
Wolf T, Debut L, Sanh V, Chaumond J, Delangue C, Moi A, Cistac P, Rault T, Louf R, Funtowicz M, Davison J, Shleifer S, von Platen P, Ma C, Jernite Y, Plu J, Xu C, Le Scao T, Gugger S, Drame M, Lhoest Q, Rush AM (2020). “HuggingFace's Transformers: State-of-the-art Natural Language Processing.” arXiv:1910.03771, https://arxiv.org/abs/1910.03771.
Corresponding BibTeX entries:
@Manual{nicenboim2025pangoling,
title = {{pangoling}: {Access} to large language model predictions
in {R}},
author = {Bruno Nicenboim},
year = {2025},
note = {R package version 1.0.3},
doi = {10.5281/zenodo.7637526},
url = {https://github.com/ropensci/pangoling},
}
@Misc{wolf2020transformers,
title = {{HuggingFace's Transformers}: State-of-the-art Natural
Language Processing},
author = {Thomas Wolf and Lysandre Debut and Victor Sanh and Julien
Chaumond and Clement Delangue and Anthony Moi and Pierric Cistac
and Tim Rault and Rémi Louf and Morgan Funtowicz and Joe Davison
and Sam Shleifer and Patrick {von Platen} and Clara Ma and Yacine
Jernite and Julien Plu and Canwen Xu and Teven {Le Scao} and
Sylvain Gugger and Mariama Drame and Quentin Lhoest and Alexander
M. Rush},
year = {2020},
eprint = {1910.03771},
archiveprefix = {arXiv},
primaryclass = {cs.CL},
url = {https://arxiv.org/abs/1910.03771},
}