[Bug]: v1.56.10 breaks get_model_info for custom providers #7575
Thanks for the issue. Will work on this, and add your example to our CI/CD to prevent future issues.
Able to repro.
krrishdholakia added a commit that referenced this issue on Jan 7, 2025.
krrishdholakia added a commit that referenced this issue on Jan 7, 2025: Handle custom llm provider scenario. Fixes https://github.com/BerriAI/litellm/issues/7575
krrishdholakia added a commit that referenced this issue on Jan 7, 2025:
* test(test_amazing_vertex_completion.py): fix test
* test: initial working code gecko test
* fix(vertex_ai_non_gemini.py): support vertex ai code gecko fake streaming. Fixes #7360
* test(test_get_model_info.py): add test for getting custom provider model info. Covers #7575
* fix(utils.py): fix get_provider_model_info check. Handle custom llm provider scenario. Fixes https://github.com/BerriAI/litellm/issues/7575
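The actual regression test lives in test_get_model_info.py in the LiteLLM repo. As a rough, hypothetical sketch only (the function name, model name, and metadata values below are assumptions, not the real test), such a check could look like:

```python
import litellm
from litellm import CustomLLM


def test_get_model_info_custom_provider():
    # Register a do-nothing custom provider handler (hypothetical sketch).
    class MyCustomLLM(CustomLLM):
        pass

    litellm.custom_provider_map = [
        {"provider": "my-custom-llm", "custom_handler": MyCustomLLM()}
    ]

    # Register metadata for a model under the custom provider (illustrative values).
    litellm.register_model(
        {
            "my-custom-llm/my-model": {
                "max_tokens": 2048,
                "input_cost_per_token": 0.0,
                "output_cost_per_token": 0.0,
                "litellm_provider": "my-custom-llm",
                "mode": "chat",
            }
        }
    )

    # Regression check: this call must not raise for custom providers.
    info = litellm.get_model_info("my-custom-llm/my-model")
    assert info["litellm_provider"] == "my-custom-llm"
```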
Closing as this is now fixed from v1.57.0+.
rajatvig pushed a commit to rajatvig/litellm that referenced this issue on Jan 16, 2025:
* test(test_amazing_vertex_completion.py): fix test
* test: initial working code gecko test
* fix(vertex_ai_non_gemini.py): support vertex ai code gecko fake streaming. Fixes BerriAI#7360
* test(test_get_model_info.py): add test for getting custom provider model info. Covers BerriAI#7575
* fix(utils.py): fix get_provider_model_info check. Handle custom llm provider scenario. Fixes https://github.com/BerriAI/litellm/issues/7575
What happened?
With v1.56.9, litellm.get_model_info for custom providers works fine. With yesterday's v1.56.10 release, litellm.get_model_info is broken for custom providers. I suspect this is caused by the changes introduced in #7538. Minimal reproducible example, adapted from the official docs on custom providers, run under the two environments below:
uvx --python 3.10 --with "litellm==1.56.9" ipython
uvx --python 3.10 --with "litellm==1.56.10" ipython
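The reporter's exact snippet is not preserved in this thread; the following is a minimal sketch of the kind of reproduction described, following the custom-provider pattern from the LiteLLM docs. The handler class, the model name my-custom-llm/my-model, and the register_model metadata are illustrative assumptions, not the original code:

```python
import litellm
from litellm import CustomLLM


class MyCustomLLM(CustomLLM):
    """Toy custom provider, per the pattern in the LiteLLM custom-provider docs."""

    def completion(self, *args, **kwargs) -> litellm.ModelResponse:
        # Return a mocked response rather than calling a real backend.
        return litellm.completion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Hello world"}],
            mock_response="Hi!",
        )


# Route "my-custom-llm/..." models to the handler above.
litellm.custom_provider_map = [
    {"provider": "my-custom-llm", "custom_handler": MyCustomLLM()}
]

# Register cost/metadata for the custom model (illustrative values).
litellm.register_model(
    {
        "my-custom-llm/my-model": {
            "max_tokens": 4096,
            "input_cost_per_token": 0.0,
            "output_cost_per_token": 0.0,
            "litellm_provider": "my-custom-llm",
            "mode": "chat",
        }
    }
)

# Returns model metadata on v1.56.9; per this report, it errors on v1.56.10.
print(litellm.get_model_info("my-custom-llm/my-model"))
```

Running the same snippet in each of the two uvx environments above shows the difference between the working and broken releases.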
Relevant log output
Are you a ML Ops Team?
Yes
What LiteLLM version are you on ?
v1.56.10
Twitter / LinkedIn details
@laurentsorber