Open-source observability, prompt management & evaluations for LLMs
Lunary helps LLM developers monitor, debug, and improve their applications.
- Analytics (cost, tokens, latency, ...)
- Monitoring (logs, traces, user tracking, ...)
- Prompt templates (versioning, team collaboration, ...)
- Create fine-tuning datasets
- Chat & feedback tracking
- Evaluations
It is also designed to be:
- Usable with any model, not just OpenAI
- Easy to integrate (2 minutes)
- Self-hostable
Lunary natively supports:
- LangChain (JS & Python)
- OpenAI module
- LiteLLM
Additionally, you can use it with any other LLM by manually sending events.
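To illustrate what "manually sending events" can look like, here is a minimal sketch of building a start/end event pair for one LLM call. The field names (`runId`, `tokensUsage`, etc.) and overall schema are assumptions for illustration, not the documented Lunary event format; consult the docs for the real payload shape.

```python
import json
import time
import uuid

def build_llm_event(event_type, model, run_id=None, **extra):
    """Build a minimal tracking-event payload for an LLM call.

    NOTE: illustrative schema only -- check the Lunary docs for the
    actual field names expected by the backend.
    """
    event = {
        "type": "llm",
        "event": event_type,              # e.g. "start" or "end"
        "runId": run_id or str(uuid.uuid4()),
        "timestamp": time.time(),
        "name": model,
    }
    event.update(extra)
    return event

# A start event when the request is issued...
start = build_llm_event("start", "gpt-4o",
                        input=[{"role": "user", "content": "Hi"}])

# ...and a matching end event sharing the same runId, with the output
# and token counts attached.
end = build_llm_event("end", "gpt-4o", run_id=start["runId"],
                      output={"role": "assistant", "content": "Hello!"},
                      tokensUsage={"prompt": 8, "completion": 2})

# Both events would then be serialized and POSTed to your Lunary backend.
payload = json.dumps([start, end])
```

Pairing events by a shared run ID is what lets the backend reconstruct latency and cost per call.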
Full documentation is available on the website.
We offer a hosted version with a free plan of up to 10k requests / month.
With the hosted version:
- don't worry about DevOps or managing updates
- get priority 1:1 support with our team
- your data is stored safely in Europe
To self-host Lunary:

- Clone the repository
- Set up a PostgreSQL instance (version 15 or higher)
- Copy the contents of `packages/backend/.env.example` to `packages/backend/.env` and fill in the missing values
- Copy the contents of `packages/frontend/.env.example` to `packages/frontend/.env`
- Run `npm install`
- Run `npm run migrate:db`
- Run `npm run dev`
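The steps above can be condensed into a shell session. The repository URL is assumed from the project name; verify it before running, and remember to fill in the missing `.env` values before starting the app.

```shell
# Condensed setup; assumes git, Node.js, and a running PostgreSQL 15+ instance.
git clone https://github.com/lunary-ai/lunary.git
cd lunary

# Create both env files from their examples, then fill in the missing values.
cp packages/backend/.env.example packages/backend/.env
cp packages/frontend/.env.example packages/frontend/.env

npm install          # install dependencies
npm run migrate:db   # apply database migrations
npm run dev          # start the backend and dashboard
```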
You can now open the dashboard at http://localhost:8080.
When using our JS or Python SDK, set the environment variable `LUNARY_API_URL` to `http://localhost:3333`. You can set `LUNARY_VERBOSE=True` to see all the events sent by the SDK.
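For a self-hosted setup, the two variables above can also be set from code before the SDK is imported. A minimal sketch (the values are the local defaults mentioned above):

```python
import os

# Point the SDK at the self-hosted backend instead of the hosted service.
os.environ.setdefault("LUNARY_API_URL", "http://localhost:3333")

# Log every event the SDK sends -- useful when debugging the integration.
os.environ.setdefault("LUNARY_VERBOSE", "True")

api_url = os.environ["LUNARY_API_URL"]
verbose = os.environ["LUNARY_VERBOSE"]
```

Using `setdefault` keeps any values already exported in the shell, so the same code works unchanged against the hosted version.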
Need help or have questions? Chat with us on the website or email us: hello [at] lunary.ai. We're here to help every step of the way.
This project is licensed under the Apache 2.0 License.