
[Bug]: API Key not being passed to Langchain Client for LiteLLM #7573

Open
ishaan812 opened this issue Jan 5, 2025 · 0 comments
Labels
bug Something isn't working

Comments

@ishaan812

What happened?

The API key is not detected by the client even though I'm passing it explicitly.

import litellm
from langchain_community.chat_models import ChatLiteLLM

from agents.constants.tenant_maps import TenantOpenAIKeyMap


def LiteLLMWrapper(tenantId: str, model: str, json_flag: bool = False):
    provider, modelName = model.split("/")
    litellm.set_verbose = True
    api_key = TenantOpenAIKeyMap[tenantId]
    if provider == "openai":
        if not api_key:
            raise Exception("No API key found for tenant")
        if json_flag:
            chat = ChatLiteLLM(
                model=modelName,
                model_kwargs={"response_format": {"type": "json_object"}},
                api_key=api_key,
                openai_api_key=api_key,
            )
        else:
            chat = ChatLiteLLM(model=modelName, api_key=api_key, openai_api_key=api_key)
        return chat
    raise Exception(f"Unsupported provider: {provider}")

Relevant log output

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ishaan812/Code/Niti-Workspace/genai-backend/venv/lib/python3.12/site-packages/litellm/main.py", line 2958, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Users/ishaan812/Code/Niti-Workspace/genai-backend/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2189, in exception_type
    raise e
  File "/Users/ishaan812/Code/Niti-Workspace/genai-backend/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 355, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
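As the error message notes, the underlying OpenAI client also reads the key from the `OPENAI_API_KEY` environment variable. A minimal workaround sketch, assuming the key is not being forwarded by the wrapper (the helper name and key value here are hypothetical):

```python
import os


def set_tenant_openai_key(api_key: str) -> None:
    # Hypothetical workaround helper: export the tenant's key so the
    # OpenAI client can fall back to OPENAI_API_KEY even if the explicit
    # api_key kwarg is not forwarded through ChatLiteLLM.
    if not api_key:
        raise ValueError("No API key found for tenant")
    os.environ["OPENAI_API_KEY"] = api_key


# Call this before constructing the ChatLiteLLM client.
set_tenant_openai_key("sk-example-key")  # hypothetical key value
```

This only sidesteps the problem for a single key per process; it does not fix the per-tenant `api_key` kwarg not reaching the client.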

Are you a ML Ops Team?

No

What LiteLLM version are you on?

1.56.10

Twitter / LinkedIn details

@ishaan812

@ishaan812 ishaan812 added the bug Something isn't working label Jan 5, 2025