
Using Outlines with vLLM/FastChat #296

Closed
rounak610 opened this issue Sep 27, 2023 · 1 comment

@rounak610

How can we use outlines with vLLM or FastChat?

@brandonwillard
Member

We already have an open issue for vLLM, #163, so feel free to follow and/or add to that.

Aside from that, exactly what kind of integrations with FastChat are you proposing? It looks like they already have some HF integrations, so using those models with outlines could be a straightforward user-level integration. If you can construct a minimal example demonstrating what you would like to be able to do, we can work from that.
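For reference, a minimal sketch of the kind of user-level integration described above: loading a Hugging Face model (the same family of models FastChat serves) through outlines and constraining its output. The module paths are assumptions and vary by outlines version; recent releases expose `outlines.models.transformers` and `outlines.generate`, while older ones used `outlines.text.generate`.

```python
# Minimal sketch of a user-level integration with a Hugging Face model.
# Assumes a recent outlines release exposing outlines.models / outlines.generate;
# "gpt2" is just a small placeholder model, not a recommendation.
import outlines

# Load any HF-hosted causal LM through outlines' transformers loader.
model = outlines.models.transformers("gpt2")

# Constrain generation to a fixed set of choices.
classify = outlines.generate.choice(model, ["positive", "negative"])
label = classify("Review: 'The food was great.' Sentiment:")
print(label)  # prints "positive" or "negative"

# Or constrain the output to a regular expression, e.g. an integer.
count = outlines.generate.regex(model, r"[0-9]+")
print(count("How many legs does a spider have? Answer with a number:"))
```

Since FastChat largely wraps Hugging Face models, loading the underlying model directly as above would presumably be the simplest path; serving constrained generation through a FastChat/vLLM endpoint is what #163 tracks.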

@dottxt-ai dottxt-ai locked and limited conversation to collaborators Sep 27, 2023
@brandonwillard brandonwillard converted this issue into discussion #297 Sep 27, 2023

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
