
added model kwargs support to AzureMLChatOnlineEndpoint #313

Conversation

Manikanta5112

Open-source chat models like Llama 2 have to be instantiated through AzureMLChatOnlineEndpoint (azure_ml_endpoint).

Currently, we check the temperature parameter directly on self.llm, but models instantiated through azure_ml_endpoint don't expose that parameter as an attribute.

Instead, it exists as an entry in the model_kwargs dictionary.

So this PR adds support for reading it from there; a sketch of the idea follows below.
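
As a rough illustration of the change described above (a minimal sketch with illustrative names such as `get_temperature`, not the exact ragas source), the idea is to fall back to the model_kwargs dictionary when the wrapped LLM does not expose temperature as a direct attribute:

```python
# Hedged sketch only: illustrative helper, not the actual ragas implementation.
def get_temperature(llm, default: float = 0.0) -> float:
    """Resolve the temperature of a LangChain chat model.

    Most chat models expose `temperature` as an attribute, but wrappers such as
    AzureMLChatOnlineEndpoint keep generation settings in a `model_kwargs` dict.
    """
    # Prefer the direct attribute when the model defines it.
    temperature = getattr(llm, "temperature", None)
    if temperature is not None:
        return temperature
    # Otherwise look inside model_kwargs, returning the default if absent.
    model_kwargs = getattr(llm, "model_kwargs", None) or {}
    return model_kwargs.get("temperature", default)
```

With this fallback, a model created with something like `AzureMLChatOnlineEndpoint(..., model_kwargs={"temperature": 0.2})` (argument shown for illustration) would resolve to 0.2 instead of raising an attribute error.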

@jjmachan
Member

hey @Manikanta5112 thank you so much for putting this PR in! Could you fix the type error that is there?

@Manikanta5112
Author

Hey @jjmachan, fixed the type error. Please check now.

@jjmachan
Member

jjmachan commented Nov 3, 2024

This has been improved with v0.2, @Manikanta5112. Did you get a chance to check it out 🙂?
Closing this for now. I'm really sorry we couldn't merge it 🙁, but thanks a million for taking the time to raise this, we're really grateful. Do check out this form https://docs.google.com/forms/d/e/1FAIpQLSdM9FrrZrnpByG4XxuTbcAB-zn-Z7i_a7CsMkgBVOWQjRJckg/viewform - our way of saying thank you 🙂

@jjmachan jjmachan closed this Nov 3, 2024