[R-304] support more providers for is_finished() logic #1548
Comments
@ahgraber: Llama models hosted on TogetherAI use
@jjmachan: hey @ahgraber, thanks for reporting this as always 🙂
@spackows: While you're at it, IBM watsonx LLMs give
@malikaltakrori: @spackows, not sure if you are still waiting for an update or if you already figured it out, but I found this solution and just added the 'eos_token' value to it.
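Based on the comments above, the core of the fix is to accept more provider-specific finish reasons than the OpenAI-style "stop". A minimal sketch of such a check (the exact set of accepted values here is an assumption pieced together from this thread, including the 'eos_token' value mentioned above):

```python
# Hypothetical finish-reason check inspired by this thread.
# The accepted values are assumptions: OpenAI-style "stop", plus
# "eos"/"eos_token"-style values reported for TogetherAI-hosted Llama
# and IBM watsonx models in the comments above.
FINISHED_REASONS = {"stop", "eos", "eos_token"}


def is_finished(finish_reason):
    """Return True when a generation ended normally (not truncated)."""
    if finish_reason is None:
        return False
    # Normalize case so "Stop" / "EOS_TOKEN" etc. are also accepted.
    return str(finish_reason).lower() in FINISHED_REASONS
```

A parser along these lines would return False for truncation reasons such as "length", which is what the default logic uses to flag incomplete generations.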
@jjmachan: hey @malikaltakrori, would you be interested in adding that as a PR? Would really appreciate it; if not, I'll make it 🙂
@malikaltakrori: Hi @jjmachan,
@malikaltakrori: @jjmachan Done!
Suggested improvement: improve the default parser's logic for determining model finish reasons so that more providers are handled out of the box. You can also define your own custom parser and use that instead.
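To illustrate the custom-parser route, here is a sketch of a parser that walks a LangChain-style `LLMResult` (nested `generations` lists, each generation carrying a `generation_info` dict with a `finish_reason` key) and accepts the provider-specific values discussed in this thread. The accepted-value set and the wrapper parameter name shown in the comment are assumptions; check the ragas version you are on:

```python
# Sketch of a custom is_finished parser, per the maintainer's suggestion.
# Assumes a LangChain LLMResult-like response: response.generations is a
# list of lists, each item exposing a generation_info dict.

def my_is_finished_parser(response):
    """Return True only if every generation reports a 'stop'-like reason."""
    finished = {"stop", "eos", "eos_token"}  # assumed set from this thread
    for gens in response.generations:
        for gen in gens:
            info = gen.generation_info or {}
            reason = str(info.get("finish_reason", "")).lower()
            if reason not in finished:
                return False
    return True

# Hypothetical usage -- the is_finished_parser parameter name is an
# assumption about the wrapper API, not confirmed by this thread:
# llm = LangchainLLMWrapper(chat_model, is_finished_parser=my_is_finished_parser)
```

Supplying a parser like this keeps the default logic untouched while letting each deployment decide which finish reasons count as a complete generation.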