I get the same error using Perplexity with any of its models. Since you saw it with o1 as well, I don't think the issue depends on the model at all; it looks like an issue with litellm.
Description
Hi, I'm using an o1-preview model hosted on Azure and I keep getting the error below. Setting the stop parameter to None does not work, and neither does omitting the stop parameter or passing an empty array. Does anyone have an idea what could be going wrong?
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
Steps to Reproduce
Create an LLM that uses an o1-preview model hosted on Azure, then run a simple crew kickoff.
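The failure can be illustrated without crewAI or Azure. Below is a minimal sketch in which `o1_chat` is a hypothetical stand-in for the Azure o1 endpoint (not real API code), mimicking the 400 response in this issue. It shows why passing `stop=None` does not help: the key is still present in the request payload, so the endpoint still rejects it; the key has to be removed entirely.

```python
# Hypothetical stand-in for an Azure o1 endpoint that rejects the 'stop'
# parameter outright, mirroring the 400 error reported in this issue.
def o1_chat(**request):
    if "stop" in request:  # the key's presence matters, not its value
        raise ValueError(
            "Error code: 400 - Unsupported parameter: 'stop' is not "
            "supported with this model."
        )
    return "ok"

# Passing stop=None still puts the key in the payload, so it still fails:
try:
    o1_chat(messages=[], stop=None)
except ValueError as exc:
    print("rejected:", exc)

# Only removing the key entirely succeeds:
request = {"messages": [], "stop": None}
request.pop("stop", None)
print(o1_chat(**request))  # -> ok
```

This matches the behavior described above: both `stop=None` and `stop=[]` fail, because either way the `stop` key is still sent.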
Expected behavior
The crew should run as it does with any other LLM, without raising this error.
Screenshots/Code snippets
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
Operating System
macOS Sonoma
Python Version
3.12
crewAI Version
0.80.0 - 0.95.0
crewAI Tools Version
0.14.0
Virtual Environment
Venv
Evidence
litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}
Possible Solution
None
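No confirmed fix, but one plausible direction: litellm exposes a global `drop_params` setting (`litellm.drop_params = True`) intended to silently drop parameters a provider doesn't support, assuming the litellm version in use honors it for Azure o1. Failing that, the offending key could be stripped from the request before it is sent. A minimal sketch of the manual approach; the `UNSUPPORTED_PARAMS` table and `scrub_params` helper are hypothetical (only `stop` is confirmed by the error in this issue, the other entries are assumptions):

```python
# Hypothetical table of parameters certain models reject. o1-preview is
# known to reject 'stop' (per the 400 error above); 'temperature' and
# 'top_p' are assumptions, not confirmed by this issue.
UNSUPPORTED_PARAMS = {
    "azure/o1-preview": {"stop", "temperature", "top_p"},
}

def scrub_params(model: str, params: dict) -> dict:
    """Return a copy of params without the keys the given model rejects."""
    blocked = UNSUPPORTED_PARAMS.get(model, set())
    return {k: v for k, v in params.items() if k not in blocked}

cleaned = scrub_params(
    "azure/o1-preview",
    {"messages": [], "stop": None, "max_tokens": 100},
)
print(cleaned)  # -> {'messages': [], 'max_tokens': 100}
```

If crewAI unconditionally injects `stop` into the litellm call, something like this would need to run inside crewAI's request path rather than in user code.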
Additional context
The docs do not seem to contain much information about this at all.