[Bug]: When using DeepSeek in the [AI Provider Configuration] option, the default configuration is incorrect: 'others/deepseek-chat' should be 'deepseek/deepseek-chat'. (Including Fix) #5978
Labels: bug (Something isn't working)
Is there an existing issue for the same bug?
Describe the bug and reproduction steps
Step 1: After a fresh Docker start, open http://localhost:3000/ to configure the AI Provider.
Step 2: Select others/deepseek-chat.
Step 3: Configure the API Key.
Step 4: Configure the GitHub token and enter a code repository.
Step 5: Send any message.
Expected: Normal response.
Actual: The chat box reports an error as follows:
completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers
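This message is the tail of litellm's BadRequestError ("LLM Provider NOT provided"), which is raised when the provider prefix in the model string is not one litellm recognizes. A minimal sketch that reproduces the same failure outside OpenHands (assuming only that litellm is installed; no API key is needed because provider resolution fails before any request is sent):

```python
# Reproduction sketch: litellm cannot map the "others/" prefix to a provider,
# so it raises BadRequestError with the "completion(model='huggingface/starcoder',..)" hint.
import litellm

try:
    litellm.completion(
        model="others/deepseek-chat",  # invalid provider prefix
        messages=[{"role": "user", "content": "Hello"}],
    )
except litellm.exceptions.BadRequestError as e:
    print(e)
```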
Fix Method:
Go to [AI Provider Configuration] -> [Custom Model], change 'others/deepseek-chat' to 'deepseek/deepseek-chat', save, and restart the session.
Reference: https://docs.litellm.ai/docs/providers/deepseek
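For a quick sanity check of the corrected model string outside OpenHands, the same call succeeds with the 'deepseek/' prefix. This sketch assumes a valid key is available in the DEEPSEEK_API_KEY environment variable, per the litellm DeepSeek docs linked above:

```python
import os
from litellm import completion

os.environ["DEEPSEEK_API_KEY"] = "sk-..."  # placeholder; use a real key

# "deepseek/" is the provider prefix litellm maps to the DeepSeek API.
response = completion(
    model="deepseek/deepseek-chat",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```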
OpenHands Installation
Docker command in README
OpenHands Version
all-hands-ai/openhands:0.17
Operating System
WSL on Windows
Logs, Errors, Screenshots, and Additional Context
Default configuration, which is wrong
Chat error with the default config other/deep
Fixed configuration according to https://docs.litellm.ai/docs/providers/deepseek
Chat works well after the fixed configuration.