Add Azure Llama client support #872
Conversation
LGTM but I'm not sure what's up with the failing tests.
azure_llama_key = os.getenv("CODEMODDER_AZURE_LLAMA_API_KEY")
azure_llama_endpoint = os.getenv("CODEMODDER_AZURE_LLAMA_ENDPOINT")
if bool(azure_llama_key) ^ bool(azure_llama_endpoint):
This is my fault because I did this originally, but != is definitely a lot clearer here 😅
yeah I kinda agree but I also got very used to reading this and it's nice :)
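For readers following the thread, a minimal sketch (not from the PR) of the two equivalent spellings of the check being discussed; the variable values are made up for illustration:

# Minimal sketch, not from the PR: for booleans, ^ (xor) and != agree,
# so either spelling expresses "exactly one of the two is set".
key, endpoint = "some-key", None  # made-up values

xor_style = bool(key) ^ bool(endpoint)
neq_style = bool(key) != bool(endpoint)

assert xor_style == neq_style  # both True here: only the key is set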
I agree. I don't think there's any need for deprecation periods at this point, we should just remember to bump major versions.
Updated from 1184a6f to aa46f0e.
Quality Gate passed
Overview
Codemodder can run codemods with Azure Llama models
Description
- Renamed context.llm_client to context.openai_llm_client for clarity
- Added the CODEMODDER_AZURE_LLAMA_ENDPOINT environment variable
- I tested this out with … and got the expected response
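For anyone reproducing that test, a hypothetical local setup (not part of this PR); the key and endpoint values below are placeholders, and per the check in the diff above both variables must be set together:

import os

# Hypothetical placeholders, not real credentials or a real deployment.
os.environ["CODEMODDER_AZURE_LLAMA_API_KEY"] = "<your-api-key>"
os.environ["CODEMODDER_AZURE_LLAMA_ENDPOINT"] = "<your-azure-llama-endpoint-url>"

# Setting only one of the two would trip the "exactly one is set" guard.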
Close #871