Call models with a prompt and output type directly #1359

Open
rlouf opened this issue Jan 3, 2025 · 1 comment · May be fixed by #1385

rlouf commented Jan 3, 2025

In the current redesign of the API, we added Generator objects that accept a model and an output type as parameters:

from outlines import Generator, models
from pydantic import BaseModel

class Foo(BaseModel):
    bar: str

model = models.openai("gpt-4o")
generator = Generator(model, Foo)
result = generator("prompt")

The rationale behind this design is that, for open source models, the index for structured generation can take some time to compile, so we hold it in this object for re-use (see the sketch after the list below). There are a few issues with this design:

  1. API models do not have this problem.
  2. It adds an extra step on the happy path from pip install outlines to generated text.
  3. It does not correspond to most people's mental model: other libraries let you call the model instance directly with the prompt to generate text.
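
To make the rationale concrete, here is a minimal sketch of the caching behavior the Generator object is meant to provide; compile_index and GeneratorSketch are hypothetical stand-ins, not the actual Outlines internals:

from pydantic import BaseModel

class Foo(BaseModel):
    bar: str

def compile_index(output_type):
    # Hypothetical stand-in for the expensive compilation of the
    # structured-generation index for open source models.
    return output_type.model_json_schema()

class GeneratorSketch:
    # Toy version of the pattern: compile the index once at construction
    # and hold it on the instance so repeated calls can reuse it.
    def __init__(self, model, output_type):
        self.model = model
        self.index = compile_index(output_type)  # paid once, held for reuse

    def __call__(self, prompt):
        # A real implementation would constrain generation with self.index;
        # a placeholder return keeps the sketch self-contained.
        return (prompt, self.index)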

I suggest that we modify the original design to make simple use cases straightforward while still allowing flexibility in more complex cases. We can add a __call__ method to model instances so that the previous workflow can be rewritten as:

from outlines import Generator, models
from pydantic import BaseModel

class Foo(BaseModel):
    bar: str

model = models.openai("gpt-4o")
result = model("prompt", Foo)  # generate directly; no separate Generator step
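
One possible way to support this call pattern, sketched under the assumption that model classes can simply build a throwaway Generator internally (CallableModel is hypothetical; this is not a prescription for the actual implementation):

from outlines import Generator

class CallableModel:
    # Hypothetical base class for model instances: calling the model with a
    # prompt and an output type builds a one-off Generator and runs it
    # immediately, so simple use cases skip the explicit Generator step.
    def __call__(self, prompt, output_type):
        generator = Generator(self, output_type)
        return generator(prompt)

Users who want to amortize the index compilation would keep constructing the Generator themselves, as in the example below.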

We would keep the Generator pattern for cases where a user wants to call the same model repeatedly with the same structure:

from outlines import Generator, models
from pydantic import BaseModel

class Foo(BaseModel):
    bar: str

model = models.openai("gpt-4o")
generator = Generator(model, Foo)  # index compiled once here

result_1 = generator("prompt_1")  # both calls reuse the same compiled index
result_2 = generator("prompt_2")
@rlouf rlouf added this to the 1.0 milestone Jan 3, 2025
@RobinPicard
Contributor

Your suggestion makes a lot of sense to me. I think that removing the generator creation step, by making it possible to call the model directly, makes the basic example look a lot simpler and more intuitive to new users.

@RobinPicard RobinPicard linked a pull request Jan 20, 2025 that will close this issue
@rlouf rlouf linked a pull request Jan 20, 2025 that will close this issue