MiniChain: A Small Library for Coding with Large Language Models

Alexander Rush


Abstract
Programming augmented by large language models (LLMs) opens up many new application areas, but also requires care. LLMs are accurate enough, on average, to replace core functionality, yet make basic mistakes that demonstrate a lack of robustness. An ecosystem of prompting tools, from intelligent agents to new programming languages, has emerged with different solutions for patching LLMs with other tools. In this work, we introduce MiniChain, an opinionated tool for LLM-augmented programming, with the design goals of ease of use for prototyping, transparency through automatic visualization, and a minimalistic approach to advanced features. The MiniChain library provides core primitives for coding LLM calls, separating out prompt templates, and capturing program structure. The library includes demo implementations of the main application papers in the area, including chat-bots, code generation, retrieval-based question answering, and complex information extraction. The library is open-source and available at https://github.com/srush/MiniChain, with code demos available at https://srush-minichain.hf.space/ and a video demo at https://www.youtube.com/watch?v=VszZ1VnO7sk.
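As an illustration of the programming style the abstract describes, here is a short sketch adapted from the examples in the repository's README (the @prompt and @transform decorators and the show() helper appear there; exact signatures may vary across library versions, so treat this as illustrative rather than definitive):

from minichain import prompt, show, OpenAI, transform

# A prompt is a plain Python function; the decorator attaches an LLM
# backend and records the call so it can be visualized later.
@prompt(OpenAI())
def color_prompt(model, input):
    return model(f"Answer 'Yes' if this is a color, {input}. Answer:")

# A transform parses raw model output into a typed value, keeping
# output parsing separate from the prompt template itself.
@transform()
def color_output(output):
    return output.strip() == "Yes"

# Chains are ordinary function composition; MiniChain captures the
# resulting program structure automatically.
def color(input):
    return color_output(color_prompt(input))

# show() builds a Gradio demo that visualizes each intermediate prompt.
gradio = show(color, examples=["red", "blue"], subprompts=[color_prompt])

if __name__ == "__main__":
    gradio.launch()

The sketch reflects the library's stated design goals: prompt templates live inside small functions, chaining is plain Python composition, and the show() wrapper supplies the automatic visualization.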
Anthology ID:
2023.emnlp-demo.27
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
December
Year:
2023
Address:
Singapore
Editors:
Yansong Feng, Els Lefever
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
311–317
URL:
https://aclanthology.org/2023.emnlp-demo.27
DOI:
10.18653/v1/2023.emnlp-demo.27
Cite (ACL):
Alexander Rush. 2023. MiniChain: A Small Library for Coding with Large Language Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 311–317, Singapore. Association for Computational Linguistics.
Cite (Informal):
MiniChain: A Small Library for Coding with Large Language Models (Rush, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-demo.27.pdf