[Inference] Fix auth token and add models starcoder and llama2 #39
Conversation
* add num_to_keep for pretrainers Signed-off-by: Zhi Lin <[email protected]>
* add num_to_keep to config Signed-off-by: Zhi Lin <[email protected]>
Signed-off-by: Zhi Lin <[email protected]>
Why is env
Please also help add
IIUC, maybe we should add it like L34 instead of dict(), right?
OK. Thanks.
Signed-off-by: Yizhong Zhang <[email protected]>
…into add_starcoder
Signed-off-by: Yizhong Zhang <[email protected]>
Gentle ping @KepingYan for another review since all CI passed. |
inference/models/starcoder.yaml (Outdated)
bot_id: ''
stop_words: []
config:
  use_auth_token: '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'
use_auth_token cannot be written directly in the config yaml; it needs to be set in the CI file. @jiafuzha please help confirm this.
Yes, strictly speaking we cannot. But it's only a read-only key. If it can pass the GitHub security check, I think we can leave it for now.
If the environment of our CI nodes has env.HF_ACCESS_TOKEN configured, I think I can try to get use_auth_token at runtime from env instead of passing it in yaml directly.
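A minimal sketch of that env fallback, assuming the model config is a plain dict loaded from yaml (the helper name `resolve_auth_token` and the config layout are illustrative, not the project's actual API):

```python
import os


def resolve_auth_token(config: dict) -> dict:
    """Fill in use_auth_token from the environment when the yaml omits it.

    HF_ACCESS_TOKEN is assumed to be exported on the CI node; a token
    written explicitly in the yaml still takes precedence.
    """
    model_cfg = config.setdefault("config", {})
    if not model_cfg.get("use_auth_token"):
        token = os.environ.get("HF_ACCESS_TOKEN")
        if token:
            model_cfg["use_auth_token"] = token
    return config
```

This keeps the secret out of the repository entirely: the yaml ships with the key unset, and only the CI environment supplies the real value.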
Yes, it's better.
Maybe we can have a unit test later to verify that use_auth_token is passed correctly. This ticket exposed and fixed the token bug in several places, which shows the real value of CI.
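Such a unit test might look like the following sketch. The loader `load_use_auth_token` is a hypothetical stand-in for the project's config handling (prefer the yaml value, fall back to the environment); only the token-resolution behavior being tested is taken from this discussion:

```python
import os
import unittest
from unittest import mock


def load_use_auth_token(config: dict):
    # Illustrative stand-in for the project's config loading: a token set
    # in the yaml wins, otherwise fall back to the HF_ACCESS_TOKEN env var.
    return config.get("use_auth_token") or os.environ.get("HF_ACCESS_TOKEN")


class TestAuthToken(unittest.TestCase):
    def test_env_fallback(self):
        # With no token in the yaml, the CI-provided env var is used.
        with mock.patch.dict(os.environ, {"HF_ACCESS_TOKEN": "hf_dummy"}):
            self.assertEqual(load_use_auth_token({}), "hf_dummy")

    def test_yaml_value_wins(self):
        # An explicit yaml value must not be overridden by the environment.
        with mock.patch.dict(os.environ, {"HF_ACCESS_TOKEN": "hf_dummy"}):
            self.assertEqual(
                load_use_auth_token({"use_auth_token": "explicit"}),
                "explicit",
            )
```

Running this in CI would have caught the misplaced token before merge, which is exactly the kind of regression guard suggested above.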
Ok, removed them from CI. Let's merge this first and I will create a follow-up PR to hide auth token.
Signed-off-by: Yizhong Zhang <[email protected]>
…into add_starcoder
All tests passed, auth token is removed from yaml. Removed
How is auth_token passed to CI? huggingface-cli login?
I used the environment variables in llm-on-ray/.github/workflows/workflow_inference.yml (lines 112 to 130 in a3be1cd).
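For context, passing a secret as a workflow environment variable typically looks like the following fragment. This is a hedged sketch of a GitHub Actions job, not the repository's actual workflow; the secret name `HF_ACCESS_TOKEN` comes from this discussion, while the job and script names are illustrative:

```yaml
jobs:
  inference:
    runs-on: self-hosted
    steps:
      - name: Run inference tests
        env:
          # Expose the Hugging Face token to the test process without
          # committing it to any config yaml in the repository.
          HF_ACCESS_TOKEN: ${{ secrets.HF_ACCESS_TOKEN }}
        run: ./run_inference_tests.sh  # illustrative script name
```

Secrets referenced this way are masked in CI logs, unlike a token written into a tracked yaml file.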
Hi @KepingYan @jiafuzha, could you take a second look to see whether there are further comments?
llm-on-ray/.github/workflows/workflow_finetune.yml (lines 66 to 67 in 6e32361)
Thanks @KepingYan. It seems there is no env file on our CI nodes.
All tests passed, removed
LGTM