[Bug]: No module named 'resource' #7581

Open

JackLittleWolf opened this issue Jan 6, 2025 · 3 comments
Labels: awaiting: user response, bug (Something isn't working)

Comments

@JackLittleWolf

What happened?

No module named 'resource'

Relevant log output

e:\miniconda3\envs\ai-agent_env\lib\site-packages\litellm\litellm_core_utils\exception_mapping_utils.py:1872 in exception_type

  1869                 if hasattr(original_exception, "status_code"):
  1870                     if original_exception.status_code == 0:
  1871                         exception_mapping_worked = True
❱ 1872                         raise APIConnectionError(
  1873                             message=f"VLLMException - {original_exception.message}",
  1874                             llm_provider="vllm",
  1875                             model=model,
APIConnectionError: litellm.APIConnectionError: VLLMException - No module named 'resource'
Error in generating model output:
litellm.APIConnectionError: VLLMException - No module named 'resource'
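
For context: the traceback path (e:\miniconda3\...) suggests a Windows environment, and Python's stdlib resource module is documented as Unix-only, so any code path that imports it fails there. A minimal illustration (runs anywhere; on Unix the import simply succeeds):

# The stdlib "resource" module is Unix-only per the Python docs,
# so importing it on Windows raises ModuleNotFoundError.
try:
    import resource  # noqa: F401
except ModuleNotFoundError as exc:
    print(exc)  # on Windows: "No module named 'resource'"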

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.53.1

Twitter / LinkedIn details

No response

@JackLittleWolf added the bug (Something isn't working) label on Jan 6, 2025
@bhaveshAswani112

@JackLittleWolf Can you please describe the issue in more detail? How can I reproduce it?

@krrishdholakia (Contributor)

Agreed with @bhaveshAswani112; seeing the full stack trace here would be helpful.

@JackLittleWolf (Author)

@bhaveshAswani112
Thank you so much! Here are the steps I ran:

Step 1: create the environment

conda create -n ai-manage_env python=3.10

Step 2: start vLLM

vllm serve /var/models/Meta-Llama-3.1-8B-Instruct --enable-auto-tool-choice --tool-call-parser hermes --trust-remote-code --tensor-parallel-size 1 --max-model-len 24464 --gpu-memory-utilization 0.99 --port 8000

Step 3: call it through LiteLLM

import litellm

response = litellm.completion(
    model="vllm/var/models/Meta-Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Which product has the highest price?"}],
    api_base="http://localhost:8000",
    api_key="vllm",
)
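
A note on the provider prefix, offered as a hedged suggestion rather than a confirmed fix: per LiteLLM's vLLM docs, the vllm/ prefix makes LiteLLM load vLLM in-process (which requires vLLM and its Unix-only dependencies, such as resource, to be importable locally), while the hosted_vllm/ prefix calls an already-running vLLM server over its OpenAI-compatible API. Since step 2 already starts a server on port 8000, a sketch of the hosted variant (the model string and api_base below are assumptions mirroring the setup above):

import litellm

# Hedged sketch: "hosted_vllm/" tells LiteLLM to call the OpenAI-compatible
# endpoint of the vLLM server from step 2 instead of importing vLLM locally.
# The model name must match what the server reports at GET /v1/models;
# here we assume it is the model path passed to `vllm serve`.
response = litellm.completion(
    model="hosted_vllm//var/models/Meta-Llama-3.1-8B-Instruct",
    messages=[{"role": "user", "content": "Which product has the highest price?"}],
    api_base="http://localhost:8000",  # may need a /v1 suffix depending on LiteLLM version
    api_key="vllm",  # any placeholder works if the server has no auth configured
)
print(response.choices[0].message.content)

If the in-process vllm/ route is actually intended, it needs a platform where vLLM itself installs and imports cleanly.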
