What happened?
Hello,
When I try to use the proxy to call my Mistral AI model through Bedrock with the tools array filled in, the response does not come back with tool_calls populated, even though Mistral is responding. I was expecting tool_calls to contain a get_current_weather call.
Here is my entry JSON:
{
  "model": "bedrock/mistral-large",
  "messages": [
    {
      "role": "user",
      "content": "what is the weather in paris ?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
Here is the answer I receive from the proxy:
{
  "id": "chatcmpl-65b8ffbd-4ee9-4c1f-94f6-00ac1038065b",
  "created": 1735915807,
  "model": "mistral.mistral-large-2402-v1:0",
  "object": "chat.completion",
  "system_fingerprint": null,
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "I don't have real-time data or browsing capabilities to provide current weather updates. However, you can check the weather in Paris by using a weather website or app, or by using a search engine and entering \"Paris weather\" as your query.",
        "role": "assistant",
        "tool_calls": null,
        "function_call": null
      }
    }
  ],
  "usage": {
    "completion_tokens": 54,
    "prompt_tokens": 15,
    "total_tokens": 69,
    "completion_tokens_details": null,
    "prompt_tokens_details": null
  }
}
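For comparison, this is roughly the shape I was expecting back from the proxy, with finish_reason set to tool_calls and tool_calls populated (the id and arguments below are made-up placeholders, only there to illustrate the OpenAI-style tool-call format; irrelevant fields are omitted):
{
  "choices": [
    {
      "finish_reason": "tool_calls",
      "index": 0,
      "message": {
        "content": null,
        "role": "assistant",
        "tool_calls": [
          {
            "id": "call_placeholder_id",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\"location\": \"Paris, France\", \"unit\": \"celsius\"}"
            }
          }
        ],
        "function_call": null
      }
    }
  ]
}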
Thank you for your help.
Relevant log output
No response
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.56.8
Twitter / LinkedIn details
No response