Updated testset generation question prompts to use JSON formatting instructions + prompt tweaks for smaller LLMs #1354
Conversation
…ith smaller and chattier LLMs
…nk prompts to use JSON formatting instructions. Also fixed an incorrect index in one of the examples of the find_relevant_contexts prompt (this makes it more explicit that the index is 1-based).
Sure thing, I can take a look to see if I can apply these fixes on the new experimental testset generation, if they are still relevant of course.
@shahules786 can you take a look?
Hey @fschuh thanks for pushing the PR. I see the changes in prompts, but prompt changes are almost always specific to the model you're using. We understand it has been hard to change the prompts of Ragas metrics and testgen components. From v0.2 we will introduce `set_prompts` and `get_prompts` interfaces for all the components, so that you can modify prompts without installing ragas from source and then raising a PR. Prompts are fundamentally just like hyperparameters for these algorithms.
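(For illustration, a minimal sketch of how such an interface might be used; the metric class and the shape of the returned prompts below are assumptions, not confirmed v0.2 API:)

```python
# Hypothetical sketch of the planned set_prompts/get_prompts workflow.
# The metric and the prompt attributes here are illustrative assumptions.
from ragas.metrics import Faithfulness

metric = Faithfulness()

# Inspect the prompts the component currently uses.
prompts = metric.get_prompts()  # assumed to return {name: prompt_object, ...}

# Tweak an instruction for a smaller local model and set it back.
name, prompt = next(iter(prompts.items()))
prompt.instruction += "\nRespond with JSON only, no extra commentary."
metric.set_prompts(**{name: prompt})
```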
I think we can try to augment the JSON parser so that it's able to parse JSON embedded in markdown or anything else.
@KylinMountain that's a good idea. I have also been thinking about it. The Prompt object should ideally be able to convert and parse prompts in any format like markdown, XML, etc.
Glad to hear that.
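(As a reference for the lenient-parser idea above, a minimal sketch — illustrative, not existing ragas code — that tolerates markdown fences and chatty text around the JSON:)

```python
import json
import re

def parse_llm_json(raw: str):
    """Best-effort JSON extraction from an LLM reply that may wrap the
    payload in markdown fences or surround it with chatty text."""
    # Prefer an explicit ```json ... ``` (or bare ```) fenced block.
    fence = re.search(r"```(?:json)?\s*(.*?)```", raw, re.DOTALL)
    if fence:
        raw = fence.group(1)
    else:
        # Fall back to the outermost {...} or [...] span.
        span = re.search(r"[\[{].*[\]}]", raw, re.DOTALL)
        if span:
            raw = span.group(0)
    return json.loads(raw)

# A chatty reply from a small local model still parses cleanly:
print(parse_llm_json('Sure! Here you go:\n```json\n{"question": "..."}\n```'))
```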
@shahules786 are we merging this in, or should we create a new issue to track this problem?
@jjmachan No.
@fschuh If you're happy with the explanation of why this is not mergeable, please consider closing the PR.
@jjmachan this mixin solution seems to allow full control of the prompts without module reload hacks or modifying the Ragas code, which is great.
yes @fschuh, it will be out on Monday - just adding the final touches 🙂
This PR is mostly targeted at making the Ragas testset generation prompts work with smaller LLMs that can be run locally.
Some of the testset generation prompts currently don't make use of the JSON format instructions that were added to the eval prompts not long ago. This PR fixes that.
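(For context, "JSON format instructions" here means schema-derived output instructions appended to the prompt so the model returns parseable JSON; a rough sketch of the idea using langchain's parser, with a made-up schema class:)

```python
# Illustrative sketch only; GeneratedQuestion is a hypothetical schema,
# not the actual ragas prompt model.
from langchain_core.output_parsers import PydanticOutputParser
from pydantic import BaseModel

class GeneratedQuestion(BaseModel):
    question: str
    relevant_context_index: int  # 1-based, per the index fix in this PR

parser = PydanticOutputParser(pydantic_object=GeneratedQuestion)

prompt = (
    "Generate one question answerable from the given context.\n\n"
    + parser.get_format_instructions()
)
print(prompt)
```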
Also, a few other notable fixes in this PR: