
feat(providers): Groq now uses LiteLLM openai-compat #1303

Merged
merged 1 commit into from
Feb 27, 2025

Conversation


@ashwinb ashwinb commented Feb 27, 2025

Groq has never supported raw completions anyhow, so this makes it easy to switch it over to LiteLLM. Our entire test suite passes.

I also updated all the openai-compat providers so they work with API keys passed via request headers (provider_data).
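Since Groq's API is OpenAI-compatible, routing it through LiteLLM mostly comes down to addressing models with a `groq/` prefix, which LiteLLM uses to pick the provider adapter. A minimal sketch of that mapping (the helper name is hypothetical, not from this PR; the model name is the one from the test plan below):

```python
def to_litellm_model(provider: str, model: str) -> str:
    # LiteLLM selects the backend from a "<provider>/" prefix on the model id,
    # e.g. "groq/llama-3.3-70b-versatile". Avoid double-prefixing if the
    # caller already passed a fully qualified id.
    if model.startswith(f"{provider}/"):
        return model
    return f"{provider}/{model}"

# The resulting id would then be passed to litellm.completion(model=..., messages=...).
print(to_litellm_model("groq", "llama-3.3-70b-versatile"))
```

The same prefixing convention applies to the other openai-compat providers (openai, anthropic, gemini) mentioned later in this PR.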
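For the header-based keys, llama-stack lets clients send per-request provider credentials as a JSON object in the `X-LlamaStack-Provider-Data` header, which the server exposes to providers as provider_data. A hedged sketch of what a client would send (the helper and the exact credential key are illustrative assumptions, not taken from this PR):

```python
import json

def provider_data_headers(creds: dict) -> dict:
    # Serialize per-request provider credentials (e.g. a Groq API key) into
    # the X-LlamaStack-Provider-Data header as a JSON object.
    return {"X-LlamaStack-Provider-Data": json.dumps(creds)}

# Example: supplying a Groq key for a single request (key name is illustrative).
headers = provider_data_headers({"groq_api_key": "gsk-..."})
print(headers["X-LlamaStack-Provider-Data"])
```

On the server side, the provider adapter would read this value back out of provider_data instead of requiring the key in the stack's static run config.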

Test Plan

LLAMA_STACK_CONFIG=groq \
   pytest -s -v tests/client-sdk/inference/test_text_inference.py \
   --inference-model=groq/llama-3.3-70b-versatile --vision-inference-model=""

Also tested the openai, anthropic, and gemini providers; no regressions.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Meta Open Source bot. label Feb 27, 2025
@ashwinb ashwinb merged commit 928a39d into main Feb 27, 2025
7 checks passed
@ashwinb ashwinb deleted the groq branch February 27, 2025 21:16