A desktop app for local AI experimentation, model inference hosting, and note-taking.
It's made to be used alongside https://github.com/alexanderatallah/window.ai/ as a simple way to get a local inference server up and running in no time. window.ai + local.ai let any web app use AI without incurring cost to either the developer or the user!
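To make the integration concrete, here is a minimal sketch of how a web app might call a window.ai-style provider. The `getCompletion` shape and the `AIProvider` interface are assumptions modeled loosely on the window.ai extension's early API, not verbatim from either project; in a browser the extension injects the provider at `window.ai`, so the stub below exists only to keep the sketch self-contained.

```typescript
// Hypothetical message and provider shapes (assumed, not the official types).
type Message = { role: "user" | "assistant"; content: string };

interface AIProvider {
  getCompletion(input: { messages: Message[] }): Promise<{ message: Message }>;
}

// In the browser, the window.ai extension would supply this object and route
// the request to a backend such as a local.ai inference server. Here we stub
// it with an echo implementation so the example runs anywhere.
const ai: AIProvider = {
  async getCompletion({ messages }) {
    const last = messages[messages.length - 1];
    return { message: { role: "assistant", content: `echo: ${last.content}` } };
  },
};

// A web app's call site: send a user prompt, await the assistant reply.
async function ask(prompt: string): Promise<string> {
  const { message } = await ai.getCompletion({
    messages: [{ role: "user", content: prompt }],
  });
  return message.content;
}
```

The point of the indirection is that the web app never hard-codes a vendor API: whichever backend the user configured in window.ai (including a local.ai endpoint) serves the request.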
Right now, local.ai uses the https://github.com/rustformers/llm Rust crate at its core. Check them out, they are super cool!
(Demo video: local.ai.demo.v0.2.x.mp4)
Here's how to run the project locally. Prerequisites:

- node >= 18
- rust >= 1.69
- pnpm >= 8

Then install dependencies and start the dev build:

```sh
pnpm i
pnpm dev
```
- Start as many inference endpoints (on separate ports) as needed
Roadmap:

- Code signing, official binary release
- Auto update server
- LLM model downloader
- Website with download links
- (NTH) Automated release bundling

Legend:

- NTH: Nice to have
- ~~item~~: Done
> Ties into the bring your own model concept.
>
> -- Alex from window.ai
Anything AI-related, including derivative works, should be open source for all to inspect. The GPLv3 license enforces this chain of open source.