
The agent reports an error using the gpt-4o-mini model #12881

Closed
5 tasks done
yunfeng13 opened this issue Jan 20, 2025 · 7 comments
Labels
🐞 bug Something isn't working

Comments

@yunfeng13

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.15.0

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

[screenshot of the agent error]

error detail:[openai] Bad Request Error, Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null. (request id: 202501202012127715196675TbPifJ3) (request id: 20250120201212658215175TCApZ8tE) (request id: 2025012020121243955128724364135) (request id: 2025012020121234417090004257917) (request id: 20250120201212335003074lycgz9cp)", 'type': 'invalid_request_error', 'param': 'messages.[4].content', 'code': None}}
I found that once the agent has context attached and the knowledge retrieval hits, asking a follow-up question triggers this error.

✔️ Expected Behavior

I would like to know the cause. When the context retrieval does not hit, there is no error; as soon as it hits, the error appears.

❌ Actual Behavior

[screenshot] The same 400 Bad Request error shown above under "Steps to reproduce".

@dosubot dosubot bot added the 🐞 bug Something isn't working label Jan 20, 2025

dosubot bot commented Jan 20, 2025

I found a similar unresolved discussion titled Bug of Use Claude, which might be related to the issue you're experiencing [1].

The error you're encountering, "Invalid value for 'content': expected a string, got null," typically indicates that the 'content' field in the 'messages' object is not being populated correctly. This can happen if the context retrieval process fails to provide valid content, resulting in a null value being passed to the OpenAI API.

To address this issue, you might want to check the following:

  1. Context Retrieval Logic: Ensure that the context retrieval logic is correctly implemented and that it returns valid string content. If the context is not found, make sure there is a fallback mechanism to handle such cases without resulting in a null value.

  2. Error Handling: Implement error handling to catch and manage cases where the context retrieval returns null. This can prevent the null value from being included in the 'messages' object.

  3. Debugging: Add logging or debugging statements to trace the values being assigned to the 'content' field in the 'messages' object. This can help identify where the null value is being introduced.

These steps should help you identify and resolve the issue with the 'content' field in the 'messages' object.
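The fallback and error-handling points above can be sketched as a small sanitization pass over the messages list before it is sent to the API. This is a hypothetical helper for illustration, not Dify's actual retrieval code; the function name `sanitize_messages` and the sample `history` are assumptions:

```python
def sanitize_messages(messages):
    """Return a copy of the messages list in which any null/None
    'content' is replaced with an empty string, satisfying the
    API constraint that 'content' must be a string.

    Hypothetical helper; Dify's real pipeline differs.
    """
    cleaned = []
    for msg in messages:
        msg = dict(msg)  # shallow copy so the original is untouched
        if msg.get("content") is None:
            msg["content"] = ""  # fallback instead of passing null through
        cleaned.append(msg)
    return cleaned


# Example history where one turn carries a None content value,
# as can happen when context retrieval yields nothing usable.
history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "assistant", "content": None},
    {"role": "user", "content": "Ask again after retrieval"},
]
print(sanitize_messages(history)[1]["content"])  # → ''
```

Combined with logging the pre-sanitization list, this would both prevent the 400 and reveal which turn introduced the null.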


@laipz8200
Member

Hi @yunfeng13. Thank you for the report. Could you please provide the logs from the api Docker container? That will help us locate this issue.

@yunfeng13
Author

[screenshot] The 400 occurs only intermittently, but when it does, its content is the error shown in the screenshot above. The container logs show nothing besides the 400.

@laipz8200
Member

That means some of the content fields in the messages array returned by your LLM API have a None value, where we expect a string.

@yunfeng13
Author

Do you mean that the API returns a null value? Would returning null cause the LLM call to report an error?

@laipz8200
Member

Yes. The path of the invalid field is messages[index].content; our data structure expects that content to be a string, not None.
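A minimal illustration of this constraint: the param path in the 400 response (`messages.[4].content`) identifies the offending entry by index. The validator below mimics that API-side check; it is illustrative only, not the provider's actual code, and `validate_messages` is an assumed name:

```python
def validate_messages(messages):
    """Raise ValueError if any message's 'content' is not a string,
    mimicking the API-side check that produces the 400 Bad Request.

    Illustrative only; not the provider's actual implementation.
    """
    for i, msg in enumerate(messages):
        if not isinstance(msg.get("content"), str):
            raise ValueError(
                f"Invalid value for 'content': expected a string, "
                f"got {msg.get('content')!r} (param: messages.[{i}].content)"
            )


msgs = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": None},  # the kind of turn that triggers the 400
]
try:
    validate_messages(msgs)
except ValueError as e:
    print(e)  # param path points at index 1, like messages.[4] in the report
```

Reading the param path from the error this way lets you dump the specific turn (here index 4 of the conversation history) and inspect what the upstream API actually returned for it.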

@yunfeng13
Author

Thank you very much. I will try another API key then.
