Added a folder for v1-single-agent with modified code to run with the Groq API and nomic embeddings with a task-specific embedding query. #22
base: main
Conversation
Added a folder with modified code to run with the Groq API and nomic embeddings with a task-specific embedding query. Here are the README details:

Open source models with the Groq API and nomic embeddings (local)

If you do not have the hardware to run Ollama with open source models, using Groq's free tier is a good way to experiment.

Additional Features
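The two pieces the PR combines can be sketched roughly as below. This is a hypothetical illustration, not the PR's actual code: the constant names and the helper `with_task_prefix` are assumptions. It relies on two documented facts: Groq exposes an OpenAI-compatible API under `https://api.groq.com/openai/v1`, and `nomic-embed-text` expects a task prefix (e.g. `search_query`, `search_document`) prepended to each input so queries and documents are embedded differently.

```python
import os

# Assumed setup, not the PR's actual code.

# 1. Groq's API is OpenAI-compatible, so an OpenAI-style client can be
#    pointed at it by overriding the base URL and API key:
GROQ_BASE_URL = "https://api.groq.com/openai/v1"
GROQ_API_KEY = os.environ.get("GROQ_API_KEY", "")

# 2. nomic-embed-text wants a task prefix on every input; user questions
#    use "search_query", indexed chunks use "search_document".
def with_task_prefix(text: str, task: str = "search_query") -> str:
    """Prepend a nomic task prefix to the text before embedding it."""
    return f"{task}: {text}"

print(with_task_prefix("how do I run Archon locally?"))
```

The prefixed string would then be sent to the local embedding endpoint (e.g. Ollama running `nomic-embed-text`), while chat completions go to Groq.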
Prerequisites
Thank you for this PR @hasanulkarim! Could you please share your motivation for implementing Groq instead of using OpenRouter when you want open source LLMs without running them locally? Is it just the speed? I love Groq, but I also want to avoid adding too many different providers to this project, since that makes the documentation and .env.example file longer!
Hi Cole, yes, my main motivation was Groq's speed (for users who don't have access to a high-end GPU for large LLMs). It also has more generous rate limits on the free tier than OpenRouter, as far as I know. I thought it was a good option for anyone who wants to try the app for free with the latest open source models, just to get familiar with agent building. I agree with your point that it can get out of hand if we start adding other providers like Gemini, but hopefully we can keep one completely free local setup (Ollama with nomic) and one decent free tier with good speed (Groq) as alternatives to OpenAI. Thank you very much for looking into the PR, and I would appreciate any feedback! Good luck with the project, and thanks for your amazing YouTube content.
thanks
Makes sense, thanks @hasanulkarim! I plan on adding more providers in bulk in a future version of Archon, so I will use this PR then!