No module named 'resource'
```
e:\miniconda3\envs\ai-agent_env\lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py:1872 in exception_type

  1869 │                 if hasattr(original_exception, "status_code"):
  1870 │                     if original_exception.status_code == 0:
  1871 │                         exception_mapping_worked = True
❱ 1872 │                         raise APIConnectionError(
  1873 │                             message=f"VLLMException - {original_exception.message}",
  1874 │                             llm_provider="vllm",
  1875 │                             model=model,

APIConnectionError: litellm.APIConnectionError: VLLMException - No module named 'resource'
Error in generating model output: litellm.APIConnectionError: VLLMException - No module named 'resource'
```
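Context on the error itself: the standard-library `resource` module is Unix-only, and the `e:\miniconda3\...` path in the traceback shows this is running on Windows, where importing it raises `ModuleNotFoundError`. A minimal sketch of the guarded-import pattern libraries use to stay importable on Windows (the helper name and default are hypothetical, not litellm's actual code):

```python
# The stdlib `resource` module exists only on Unix-like systems;
# on Windows the import fails, so guard it and fall back.
try:
    import resource  # Unix-only
except ImportError:  # e.g. on Windows
    resource = None


def max_open_files_soft_limit(default=1024):
    """Return the soft RLIMIT_NOFILE where available, else a default.

    Hypothetical helper illustrating the fallback pattern.
    """
    if resource is None:
        return default
    # getrlimit returns a (soft, hard) tuple.
    return resource.getrlimit(resource.RLIMIT_NOFILE)[0]
```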
@JackLittleWolf Can you please describe the issue in detail? How can I reproduce it?
Agreed with @bhaveshAswani112, seeing the full stack trace here would be helpful.
@bhaveshAswani112 Thank you so much! Here are the steps I ran:
Step 1: create the conda environment:

```
conda create -n ai-manage_env python=3.10
```

Step 2: start vLLM:

```
vllm serve /var/models/Meta-Llama-3.1-8B-Instruct --enable-auto-tool-choice --tool-call-parser hermes --gpu-memory-utilization 0 --trust-remote-code --tensor-parallel-size 1 --max-model-len 24464 --gpu-memory-utilization 0.99 --port 8000
```

Step 3: call the model through litellm:

```python
response = litellm.completion(
    model="vllm/var/models/Meta-Llama-3.1-8B-Instruct",
    messages='Which product has the highest price?',
    api_base='http://localhost:8000',
    api_key='vllm',
)
```
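One detail worth noting about step 3: `litellm.completion` follows the OpenAI chat format, where `messages` is a list of role/content dicts rather than a bare string. A sketch of the call with that shape, reusing the model path, `api_base`, and `api_key` from the steps above (the actual network call is commented out since it needs the vLLM server running):

```python
# Step 3 with `messages` in the OpenAI-style list-of-dicts shape;
# the original report passed a bare string instead.
completion_kwargs = dict(
    model="vllm/var/models/Meta-Llama-3.1-8B-Instruct",
    messages=[
        {"role": "user", "content": "Which product has the highest price?"},
    ],
    api_base="http://localhost:8000",
    api_key="vllm",
)

# With the vLLM server from step 2 running, the call would be:
#   import litellm
#   response = litellm.completion(**completion_kwargs)
```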
What happened?
No module named 'resource'
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.53.1
Twitter / LinkedIn details
No response