Proposed LlamaIndex integration #380

Closed

Conversation

RobinPicard
Contributor

This PR is not meant to be merged; its aim is to propose a potential way of integrating outlines with LlamaIndex (issue #345).

Using LlamaIndex from outlines is not entirely straightforward, as the way LlamaIndex is built is not fully suited to our use case. To avoid replicating too much of LlamaIndex's logic, or interacting with its objects in a way that feels too hacky, I thought the easiest approach would be to create a custom version of their LLM object, specific to outlines, which gets called wherever LlamaIndex would otherwise call the LLM chosen by the user.

The idea is that by calling the query method of the LlamaIndex engine as a regular user would, we let LlamaIndex figure out which context elements to add to the original prompt (including how many calls to make in the case of a more sophisticated response synthesizer). The outlines function the user selected is then called downstream of the LlamaIndex process, through our custom LLM object, with the modified prompt. It returns the CompletionResponse object that LlamaIndex expects, and LlamaIndex then decides whether to make more queries or simply return the response, based on the context configuration. A sketch of such an object is shown below.
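To make the flow concrete, here is a minimal sketch of the kind of custom LLM object described above, assuming the pre-0.10 llama_index API where `CustomLLM`, `CompletionResponse`, and `LLMMetadata` are importable from `llama_index.llms`. The `generate_fn` callable stands in for whatever constrained outlines generator the user has built; its exact signature is an assumption here, not part of this PR.

```python
from typing import Any, Callable

from llama_index.llms import (
    CompletionResponse,
    CompletionResponseGen,
    CustomLLM,
    LLMMetadata,
)
from llama_index.llms.base import llm_completion_callback


class OutlinesLLM(CustomLLM):
    """Wraps an outlines generator so LlamaIndex can call it like any other LLM."""

    # Any callable that maps a prompt string to a generated string,
    # e.g. a regex- or JSON-constrained outlines generator.
    generate_fn: Callable[[str], str]

    @property
    def metadata(self) -> LLMMetadata:
        # Generic placeholder metadata; real limits depend on the underlying model.
        return LLMMetadata(model_name="outlines-wrapped-model")

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # By this point LlamaIndex has already injected the retrieved context
        # into `prompt`; we simply run the user's outlines generator on it.
        return CompletionResponse(text=self.generate_fn(prompt))

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # Streaming is out of scope for this proof of concept.
        raise NotImplementedError("Streaming is not supported in this sketch.")
```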

There are definitely many possible configuration arguments that are not covered here, and I'm not sure it would already work for more complex cases, but the basic example included seems to work as intended. I'd be curious to know whether you think this looks like a promising direction to take.
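For reference, the intended usage follows the flow described above: the engine's `query` call is the only entry point the user touches, and the outlines-backed LLM is invoked downstream with the context-augmented prompt. This is a hypothetical example assuming the pre-0.10 `ServiceContext` API; names such as `generate_city` and the `data` directory are illustrative only.

```python
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()

# Plug the outlines-backed LLM into LlamaIndex in place of the default LLM.
service_context = ServiceContext.from_defaults(
    llm=OutlinesLLM(generate_fn=generate_city),  # constrained generator built with outlines
    embed_model="local",
)

index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()

# LlamaIndex retrieves context, builds the prompt, and calls OutlinesLLM.complete.
response = query_engine.query("Which city is mentioned in the document?")
```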

@rlouf
Member

rlouf commented Nov 20, 2023

Thanks for opening a PR! My wording was confusing: what I meant was opening a PR on the LlamaIndex repo to add Outlines as a model provider 😬

@rlouf rlouf marked this pull request as draft November 20, 2023 07:11
@RobinPicard
Contributor Author

Ahhh 😅 I can try, but I'm not sure I'll manage to convince them to make the changes needed to accommodate outlines beyond just adding a new model provider.
