llamacpp-for-kobold, a zero dependency KoboldAI compatible REST API interfacing with llama.cpp via ctypes bindings #315
LostRuins started this conversation in Show and tell.
Now that it's somewhat usable, I'd like to share a thing I made: https://github.com/LostRuins/llamacpp-for-kobold
This fork mainly does two things:
- wraps llama.cpp as a shared library and calls it directly from Python via ctypes, with zero additional dependencies
- exposes a KoboldAI-compatible REST API on top of that binding, so existing KoboldAI clients can connect to it
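As a rough sketch of the ctypes approach: Python can load a compiled shared library and call its C functions directly, with no third-party packages. The example below demonstrates the mechanism against libc's `strlen`; the actual llama.cpp library name and exported entry points would of course differ, so treat those as assumptions.

```python
import ctypes
import ctypes.util

# Load the C standard library as a stand-in for a llama.cpp shared
# library; the binding mechanism is identical. A real binding would
# do something like ctypes.CDLL("./llamacpp.so") instead.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare the C signature before calling, just as a real binding
# would for each function exported from llama.cpp.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # → 5
```

Declaring `argtypes`/`restype` up front is what keeps the Python-to-C boundary safe: without it, ctypes guesses types and can silently corrupt arguments on some platforms.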
The only pressing flaw is prompt-processing latency, tracked in #229, which hopefully someone can solve.
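For a sense of the "zero dependency" server side, here is a minimal stdlib-only sketch of a KoboldAI-style generate endpoint. The `/api/v1/generate` route and the `results`/`text` payload shape follow KoboldAI's API convention; `run_model` is a hypothetical placeholder standing in for the actual ctypes call into llama.cpp.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_model(prompt):
    # Hypothetical stand-in for the ctypes call into llama.cpp;
    # here it just echoes the prompt in upper case.
    return prompt.upper()

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Only the KoboldAI-style generate route is handled.
        if self.path != "/api/v1/generate":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        text = run_model(payload.get("prompt", ""))
        # KoboldAI clients expect {"results": [{"text": ...}]}.
        body = json.dumps({"results": [{"text": text}]}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 5001), Handler).serve_forever()
```

Because everything here is standard library (`http.server`, `json`, `ctypes`), a user only needs a Python interpreter and the compiled llama.cpp library to run it.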