
[Bug]: Deepseek support for JSON response format #7580

Open
shadi-fsai opened this issue Jan 6, 2025 · 0 comments
Labels
bug Something isn't working

Comments

@shadi-fsai
What happened?

A bug happened!
I am trying to get deepseek-chat to respond in JSON format through litellm, and I am getting:

litellm.exceptions.BadRequestError: litellm.BadRequestError: DeepseekException - Failed to deserialize the JSON body into the target type: response_format: response_format.type json_schema is unavailable now at line 1 column 1193

I'm using the following code to call deepseek through litellm:

from typing import List

from pydantic import BaseModel
from litellm import completion

class TopicSchema(BaseModel):
    topic: str
    subtopics: List[str]

class CurriculumSchema(BaseModel):
    topics: List[TopicSchema]

prompt = settings.curriculum_prompt
response = completion(
    model=settings.datagen_model,
    messages=[
        {"role": "system", "content": settings.teacher_role},
        {"role": "user", "content": prompt},
    ],
    response_format=CurriculumSchema,
    temperature=0,
    max_tokens=4096,
)

This code works with llama3.3 but not with deepseek, even though deepseek appears to support JSON schemas:

https://api-docs.deepseek.com/guides/json_mode
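The error message suggests the DeepSeek API currently accepts only `{"type": "json_object"}` (JSON mode), not the `json_schema` response format that LiteLLM emits when given a Pydantic model. A possible workaround, sketched below under that assumption, is to request plain JSON mode, describe the schema in the prompt (DeepSeek's JSON-mode guide says the word "json" must appear in the prompt), and validate the response locally with Pydantic. The helper names (`build_messages`, `parse_curriculum`) are illustrative, not part of litellm; the code assumes Pydantic v2.

```python
# Sketch of a workaround: use DeepSeek's "json_object" mode and validate
# the schema client-side, since server-side json_schema is unavailable.
import json
from typing import List

from pydantic import BaseModel

class TopicSchema(BaseModel):
    topic: str
    subtopics: List[str]

class CurriculumSchema(BaseModel):
    topics: List[TopicSchema]

def build_messages(system_role: str, prompt: str) -> list:
    # Embed the JSON schema in the user prompt so the model knows the
    # target shape; DeepSeek's JSON mode requires "json" in the prompt.
    schema_hint = json.dumps(CurriculumSchema.model_json_schema())
    return [
        {"role": "system", "content": system_role},
        {
            "role": "user",
            "content": f"{prompt}\n\nRespond with json matching this schema:\n{schema_hint}",
        },
    ]

def parse_curriculum(raw: str) -> CurriculumSchema:
    # The server no longer enforces the schema, so validate locally;
    # this raises pydantic.ValidationError on malformed output.
    return CurriculumSchema.model_validate_json(raw)
```

With these helpers, the `completion(...)` call would pass `messages=build_messages(...)` and `response_format={"type": "json_object"}` instead of the Pydantic class, then feed `response.choices[0].message.content` to `parse_curriculum`. This is a sketch of one possible workaround, not a confirmed fix for the underlying LiteLLM/DeepSeek issue.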

Relevant log output

No response

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

1.57.0

Twitter / LinkedIn details

No response

@shadi-fsai shadi-fsai added the bug Something isn't working label Jan 6, 2025