[BUG] o1-preview: Unsupported parameter: 'stop' is not supported with this model #1908

Open
FoleyTim opened this issue Jan 16, 2025 · 2 comments
Labels: bug (Something isn't working)

Comments

@FoleyTim

FoleyTim commented Jan 16, 2025

Description

Hi, I'm using an o1-preview model hosted on Azure and I keep getting the error below. Passing stop=None does not seem to work, and neither does omitting the stop parameter or passing it an empty array. Does anyone have any ideas about what could be going wrong?

return LLM(
    api_key=config.api_key(),
    base_url=config.endpoint(),
    api_version=config.api_version(),
    azure=True,
    deployment_id=config.o1_deployment(),
    model=config.o1_model(),
    max_tokens=16384,
    temperature=1,
    top_p=1,
    stop=None,
)

litellm.exceptions.BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported parameter: 'stop' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'stop', 'code': 'unsupported_parameter'}}

Steps to Reproduce

Create an LLM with an o1-preview model hosted on Azure, then run a simple crew kickoff; a minimal sketch follows.
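
A rough reproduction sketch using the standard crewAI Agent/Task/Crew API; the agent, task, and placeholder credentials below are illustrative assumptions, not taken from this report:

from crewai import Agent, Crew, LLM, Task

# Placeholder Azure settings; substitute your own deployment details.
llm = LLM(
    model="azure/<o1-deployment-name>",
    api_key="<azure-api-key>",
    base_url="<azure-endpoint>",
    api_version="<api-version>",
    temperature=1,
    top_p=1,
)

agent = Agent(
    role="Researcher",
    goal="Answer a simple question",
    backstory="Minimal agent used only to reproduce the error",
    llm=llm,
)

task = Task(
    description="Say hello.",
    expected_output="A one-line greeting.",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])
result = crew.kickoff()  # fails with the BadRequestError above when 'stop' is forwarded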

Expected behavior

The crew should run as it does with any other LLM, without raising this error.

Screenshots/Code snippets

Same LLM configuration and error traceback as shown in the Description above.

Operating System

macOS Sonoma

Python Version

3.12

crewAI Version

0.80.0 - 0.95.0

crewAI Tools Version

0.14.0

Virtual Environment

Venv

Evidence

Same BadRequestError traceback as quoted in the Description above.

Possible Solution

None

Additional context

There does not seem to be much information about this in the docs at all.

FoleyTim added the bug label on Jan 16, 2025
@FoleyTim
Author

I managed to fix this by monkey-patching litellm.completion:

import litellm

# Keep a reference to the original completion function.
original_completion = litellm.completion

def patched_completion(*args, **kwargs):
    # Strip the 'stop' parameter before the request reaches Azure,
    # since o1 models reject it with a 400.
    if 'stop' in kwargs:
        print("Removing 'stop' parameter from LiteLLM call...")
        kwargs.pop('stop')
    return original_completion(*args, **kwargs)

# Route all litellm completions through the patched wrapper.
litellm.completion = patched_completion
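
A lighter-weight alternative, assuming the installed litellm version exposes the flag, is litellm's global drop_params switch, which asks litellm to silently drop parameters the target provider does not accept rather than forwarding them:

import litellm

# Drop provider-unsupported parameters (such as 'stop' for o1 models)
# instead of sending them and getting a 400 back.
litellm.drop_params = True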

@sterling000
Copy link

I get the same error using Perplexity with any of its models. Since you saw it with o1, I think the issue is not dependent on the model at all; as you found, it is an issue with how litellm is called.
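
One way to check this per provider, assuming the installed litellm release exposes the helper, is get_supported_openai_params, which reports which OpenAI-style parameters litellm believes a given model accepts; the model names below are illustrative placeholders:

from litellm import get_supported_openai_params

# Compare what litellm thinks each provider supports.
print(get_supported_openai_params(model="o1-preview", custom_llm_provider="azure"))
print(get_supported_openai_params(model="<perplexity-model-name>", custom_llm_provider="perplexity"))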
