Prompto: An open source library for asynchronous querying of LLM endpoints

Bibliographic Details
Published in: arXiv.org (Dec 16, 2024), p. n/a
Main author: Chan, Ryan Sze-Yin
Other authors: Nanni, Federico, Williams, Angus R, Brown, Edwin, Burke-Moore, Liam, Chapman, Ed, Onslow, Kate, Sippy, Tvesha, Bright, Jonathan, Gabasova, Evelina
Publisher: Cornell University Library, arXiv.org
Online access: Citation/Abstract
Full text outside of ProQuest
Description
Abstract: The recent surge in Large Language Model (LLM) availability has opened exciting avenues for research. However, efficiently interacting with these models presents a significant hurdle, since LLMs often reside on proprietary or self-hosted API endpoints, each requiring custom code for interaction. Conducting comparative studies between different models can therefore be time-consuming and necessitate significant engineering effort, hindering research efficiency and reproducibility. To address these challenges, we present prompto, an open source Python library which facilitates asynchronous querying of LLM endpoints, enabling researchers to interact with multiple LLMs concurrently while maximising efficiency and utilising individual rate limits. Our library empowers researchers and developers to interact with LLMs more effectively, allowing faster experimentation, data generation and evaluation. prompto is released with an introductory video (https://youtu.be/lWN9hXBOLyQ) under MIT License and is available via GitHub (https://github.com/alan-turing-institute/prompto).
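The abstract describes prompto's core pattern, querying several LLM endpoints concurrently while respecting each endpoint's own rate limit, but gives no usage details. The sketch below is a minimal, generic asyncio illustration of that pattern, not prompto's actual API: the function query_endpoint, the endpoint names, and the requests-per-minute values are all hypothetical placeholders.

# Generic illustration of asynchronous, rate-limited querying of several
# LLM endpoints. This is NOT prompto's API; names and limits are assumed.
import asyncio

async def query_endpoint(endpoint: str, prompt: str) -> str:
    # Placeholder for a real HTTP call to the model endpoint (assumption).
    await asyncio.sleep(0.1)  # simulate network latency
    return f"[{endpoint}] response to: {prompt}"

async def rate_limited_worker(endpoint: str, prompts: list[str],
                              requests_per_minute: int) -> list[str]:
    # Enforce a simple per-endpoint rate limit by spacing out requests.
    interval = 60.0 / requests_per_minute
    results = []
    for prompt in prompts:
        results.append(await query_endpoint(endpoint, prompt))
        await asyncio.sleep(interval)
    return results

async def main() -> None:
    prompts = ["What is 2 + 2?", "Name a prime number."]
    # Each endpoint is queried concurrently with its own (assumed) rate limit.
    endpoints = {"model-a": 60, "model-b": 30}
    tasks = [rate_limited_worker(name, prompts, rpm)
             for name, rpm in endpoints.items()]
    for batch in await asyncio.gather(*tasks):
        for response in batch:
            print(response)

if __name__ == "__main__":
    asyncio.run(main())

Because the workers run concurrently, a slower or more tightly rate-limited endpoint does not block the others; this is the efficiency gain the abstract attributes to asynchronous querying.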
ISSN:2331-8422
Source: Engineering Database