
Add model_prompt config param for LLMBlock #141

Merged
merged 4 commits into instructlab:main on Jul 16, 2024

Conversation

@russellb (Member) commented on Jul 15, 2024

Closes #136

2363dfe docs: Add a doc to track pipeline config version history
87871d3 llmblock: add model_prompt config parameter
4f042ae Add test case for LLMBlock model_prompt parameter
7045456 llmblock: allow model_prompt=""

commit 2363dfe
Author: Russell Bryant [email protected]
Date: Mon Jul 15 14:35:26 2024 -0400

docs: Add a doc to track pipeline config version history

Our initial pipeline configuration is version `1.0`. We haven't
actually released anything yet, but I think it would be good to go
ahead and get in the habit of managing this version number as we make
additions. Create a doc where we can keep a list of what changed at
each version bump.

Signed-off-by: Russell Bryant <[email protected]>

commit 87871d3
Author: Russell Bryant [email protected]
Date: Mon Jul 15 14:40:18 2024 -0400

llmblock: add `model_prompt` config parameter

We previously always set the model prompt based on the model family.
We have some cases where we'd like to override this default behavior
explicitly as part of pipeline configuration. This parameter will do
that.

Signed-off-by: Russell Bryant <[email protected]>
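For illustration, here is what a pipeline block definition carrying an explicit model_prompt might look like once the YAML config is loaded into Python. This is a hypothetical sketch: only model_prompt comes from this PR, and the surrounding keys are placeholders rather than the project's actual schema.

```python
# Hypothetical sketch, not the real pipeline schema: a block entry as it might
# appear after the YAML pipeline config is loaded. Only model_prompt is the
# parameter added by this PR; the other keys and values are placeholders.
block_entry = {
    "name": "gen_questions",
    "type": "LLMBlock",
    "config": {
        "output_cols": ["question"],
        # Explicit override: skip the family-based default and wrap prompts
        # with this template instead ("{prompt}" marks the generated text).
        "model_prompt": "<s> [INST] {prompt} [/INST]",
    },
}
```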

commit 4f042ae
Author: Russell Bryant [email protected]
Date: Mon Jul 15 14:57:32 2024 -0400

Add test case for LLMBlock model_prompt parameter

Add a test case that ensures that passing `model_prompt` as an empty
string results in `self.model_prompt` as empty. If this parameter is
not specified (is None), `self.model_prompt` should be a non-empty
string as a result of our default behavior of picking a prompt based
on the model family.

Signed-off-by: Russell Bryant <[email protected]>
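A rough sketch of the behaviour that test checks, in pytest form. The llmblock_factory fixture and the model_prompt keyword used here are assumptions for illustration, not the project's actual test code.

```python
# Sketch of the described behaviour, not the actual test added in this PR.
# Assumes a hypothetical llmblock_factory fixture that constructs an LLMBlock
# and accepts a model_prompt keyword argument.

def test_model_prompt_empty_string(llmblock_factory):
    # An explicit empty string is preserved as-is; the family default is not applied.
    block = llmblock_factory(model_prompt="")
    assert block.model_prompt == ""


def test_model_prompt_unset_uses_family_default(llmblock_factory):
    # When model_prompt is not specified (None), a non-empty prompt is chosen
    # based on the model family.
    block = llmblock_factory(model_prompt=None)
    assert isinstance(block.model_prompt, str)
    assert block.model_prompt != ""
```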

commit 7045456
Author: Mark McLoughlin [email protected]
Date: Mon Jul 15 23:56:10 2024 +0100

llmblock: allow model_prompt=""

An anticipated use case for model_prompt is simply to disable
this additional prompt. Currently the pipeline author would
need to specify model_prompt="{prompt}" to achieve this.

Make this easier by allowing model_prompt="" to have this
meaning.

Signed-off-by: Mark McLoughlin <[email protected]>
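Taken together, the commits give model_prompt three behaviours: unset (None) falls back to a family-based default, an empty string disables the extra wrapping, and any other value is used as the wrapping template. A minimal sketch of that logic follows; the function and argument names are illustrative, not the actual LLMBlock implementation.

```python
# Illustrative sketch of the model_prompt semantics described above,
# not the actual LLMBlock code.

def wrap_prompt(prompt, model_prompt, model_family, family_defaults):
    """Apply the configured model_prompt behaviour to a generated prompt."""
    if model_prompt is None:
        # Not specified: fall back to the default template for the model family.
        model_prompt = family_defaults[model_family]
    if model_prompt == "":
        # Explicitly disabled: pass the prompt through unchanged,
        # equivalent to configuring model_prompt="{prompt}".
        return prompt
    # The template is applied with str.format, so "{prompt}" marks where the
    # generated text is substituted.
    return model_prompt.format(prompt=prompt)
```

For example, wrap_prompt("List three facts.", "", "mixtral", defaults) returns the prompt unchanged, while passing model_prompt=None would wrap it in whatever template defaults["mixtral"] holds.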

mergify bot added the documentation (Improvements or additions to documentation) and ci-failure labels on Jul 15, 2024
mergify bot added the testing (Relates to testing) label and removed the ci-failure label on Jul 15, 2024
russellb marked this pull request as ready for review on Jul 15, 2024 at 21:01
@markmc (Contributor) commented on Jul 15, 2024

I've added commit 2ff97ba to allow model_prompt: "", which is friendlier than model_prompt: "{prompt}".

Review thread on docs/pipeline_config.md (outdated, resolved)
russellb and others added 4 commits July 15, 2024 19:52
@russellb (Member, Author) left a comment:

the changes @markmc added lgtm

markmc added this to the 0.1.3 milestone on Jul 16, 2024
markmc merged commit eb78549 into instructlab:main on Jul 16, 2024
13 checks passed

## Pipeline Configuration Schema

A schema for validating pipeline configuration can be found in [`src/instructlab/sdg/pipelines/schema/v1.json`](../src//instructlab/sdg/pipelines/schema/v1.json)
@derekhiggins (Contributor) commented on Jul 16, 2024:

nit: double "/"'s in path name

@russellb (Member, Author) replied:
oops
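For context on the schema file referenced in the excerpt above, here is a minimal sketch of validating a pipeline config against it with the jsonschema package. The config filename is a placeholder, and the schema path follows the doc excerpt (with the doubled slash removed).

```python
# Minimal sketch: validate a pipeline YAML config against the v1 JSON schema
# referenced in docs/pipeline_config.md. File paths here are assumptions.
import json

import yaml
from jsonschema import validate

with open("src/instructlab/sdg/pipelines/schema/v1.json", encoding="utf-8") as f:
    schema = json.load(f)

with open("my_pipeline.yaml", encoding="utf-8") as f:  # placeholder config file
    pipeline = yaml.safe_load(f)

# Raises jsonschema.exceptions.ValidationError if the config does not conform.
validate(instance=pipeline, schema=schema)
```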

jwm4 pushed a commit to jwm4/sdg that referenced this pull request on Dec 13, 2024:
…sign-doc
Add design doc for training to explain accelerate inclusion for FSDP
Labels: documentation (Improvements or additions to documentation), testing (Relates to testing)

Successfully merging this pull request may close these issues:

Support a "blank" model prompt in LLMBlock

3 participants