
More modular interface? #29

Closed
simon-mo opened this issue Nov 21, 2023 · 1 comment
Comments

@simon-mo

Is it possible to have an interface where, rather than lm-format-enforcer wrapping vLLM, users install both libraries directly and configure the integration through SamplingParams?

The pseudocode in my mind:

SamplingParams(
    logits_processors=[lmformatenforcer.JSONSchema(MyPydanticClass)],
    max_tokens=16,
)
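To make the proposal concrete, here is a minimal, self-contained sketch of the shape such an interface could take. `SamplingParams`, `json_schema_processor`, and the masking logic are simplified stand-ins, not the real vLLM or lm-format-enforcer APIs.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Stand-in for vLLM's logits-processor contract: a callable taking
# (generated_token_ids, logits) and returning modified logits.
LogitsProcessor = Callable[[List[int], List[float]], List[float]]

@dataclass
class SamplingParams:
    # Simplified stand-in for vLLM's SamplingParams.
    logits_processors: List[LogitsProcessor] = field(default_factory=list)
    max_tokens: int = 16

def json_schema_processor(allowed_token_ids: set) -> LogitsProcessor:
    """Hypothetical factory: mask every token the format does not allow."""
    def process(token_ids: List[int], logits: List[float]) -> List[float]:
        return [
            score if tok_id in allowed_token_ids else float("-inf")
            for tok_id, score in enumerate(logits)
        ]
    return process

# Usage in the style the issue proposes.
params = SamplingParams(
    logits_processors=[json_schema_processor({0, 2})],
    max_tokens=16,
)
masked = params.logits_processors[0]([], [0.1, 0.5, 0.3])
```

In this style the user composes the processor themselves and hands it to the sampling configuration, instead of the format-enforcement library owning the vLLM call.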

Here's a similar proposal to Outlines: dottxt-ai/outlines#163 (comment)

@noamgat
Owner

noamgat commented Nov 21, 2023

The logits processor needs the tokenizer to prepare itself. To make an API like this possible, the logits-processor interface would have to be more than a Callable: a class with some init(tokenizer) functionality. That would be inconsistent with how other inference libraries expose logits-processing APIs, which is why I chose this approach with vLLM.

It's technically possible, but it would require changes to vLLM's SamplingParams, and I actually think the current interface is the right one.
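The constraint described above can be sketched as follows: a format-enforcing processor naturally needs tokenizer-dependent setup in its constructor (e.g. precomputing which token IDs decode to allowed text), while only its `__call__` matches the plain Callable signature that inference libraries expose. All names here are illustrative, not actual lm-format-enforcer internals.

```python
from typing import Dict, List

class FormatEnforcingProcessor:
    """Sketch of why a bare Callable is not enough: the tokenizer-dependent
    work has to happen once, up front, in an init step."""

    def __init__(self, vocab: Dict[int, str], allowed_chars: set):
        # Expensive, tokenizer-dependent preparation at construction time:
        # keep only tokens whose decoded text uses allowed characters.
        self.allowed_ids = {
            tok_id for tok_id, text in vocab.items()
            if all(ch in allowed_chars for ch in text)
        }

    def __call__(self, token_ids: List[int], logits: List[float]) -> List[float]:
        # Only this method fits the Callable interface libraries expect.
        return [
            score if tok_id in self.allowed_ids else float("-inf")
            for tok_id, score in enumerate(logits)
        ]

# Toy vocabulary mapping token id -> decoded text.
vocab = {0: "1", 1: "a", 2: "2", 3: "{"}
proc = FormatEnforcingProcessor(vocab, allowed_chars=set("0123456789"))
out = proc([], [1.0, 1.0, 1.0, 1.0])
```

Because the constructor needs the tokenizer (here a toy vocab), a SamplingParams field that accepts only pre-built callables forces the user, rather than the library, to wire the tokenizer in.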

@noamgat noamgat closed this as completed Dec 6, 2023