[Provider] Added Support for MonsterAPI Integration #280
Conversation
Hey @DheerajQblocks. Thanks for the PR! Can you please update this with the latest main branch changes? It's missing a lot of major changes that were pushed last week. Also, the integration does not seem to follow the OpenAI transform convention that we use for other providers. All the responses are still in MonsterAPI format, which is not correct. Please let me know if I am missing something or if you need any more details on this.
Hi @VisargD, thank you for reviewing the PR. I have updated it with the latest changes from the main branch and ensured that it follows the OpenAI transform convention we use for other providers. The responses now conform to the expected structure. Please review the changes and let me know if any further adjustments are needed. Best regards,
package-lock.json (Outdated)
Can you please remove these changes to package-lock.json? They seem unrelated to the PR and are also causing npm install issues.
Done
@VisargD Please review/approve this PR.
@Dheeraj-Bhandari - Is this the correct doc for the API that you have integrated? https://developer.monsterapi.ai/reference/generate_v1_generate_post
Yes, it's correct.
@VisargD Please take a look.
Hey @DheerajQblocks - Sorry for the delay here. I will start the review later today.
src/providers/monsterapi/index.ts (Outdated)
api: MonsterAPIApiConfig,
generate: MonsterAPIChatCompleteConfig,
responseTransforms: {
  generate: MonsterAPIChatCompleteResponseTransform,
Instead of `generate`, the name should be `chatComplete` for this.
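For reference, a hedged sketch of that rename. The imported names match this PR's diff, but the config and transform bodies below are simplified stand-ins, not the gateway's real shapes:

```typescript
// Simplified stand-ins for the real imports from api.ts / chatComplete.ts;
// the actual config objects and transform live elsewhere in the provider.
const MonsterAPIApiConfig = {};
const MonsterAPIChatCompleteConfig = {};
const MonsterAPIChatCompleteResponseTransform = (response: unknown) => response;

// Register everything under `chatComplete` (was `generate`) so the provider
// follows the same naming convention as the other gateway providers.
const MonsterAPIConfig = {
  api: MonsterAPIApiConfig,
  chatComplete: MonsterAPIChatCompleteConfig,
  responseTransforms: {
    chatComplete: MonsterAPIChatCompleteResponseTransform,
  },
};
```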
src/providers/monsterapi/generate.ts (Outdated)
if ('response' in response) {
  return {
    id: '',
    object: '',
The `object` value can be `chat.completion` as we are following the OpenAI spec.
src/providers/monsterapi/generate.ts (Outdated)
return {
  id: '',
  object: '',
  created: 0,
For `created`, you can use `Math.floor(Date.now() / 1000)` as the value.
src/providers/monsterapi/generate.ts (Outdated)
if ('response' in response) {
  return {
    id: '',
To keep the response structure consistent across providers, you can generate a random id and send it in the response so that none of the existing SDKs break. We currently use `Date.now().toString()` in some providers as the response id.
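Putting the three review suggestions (id, object, created) together, here is a hedged sketch of the transform's success branch. The MonsterAPI payload shape `{ response: { text: string } }` is an assumption for illustration only; the three highlighted values are the ones the comments above call for:

```typescript
// Assumed payload shape for illustration; the real MonsterAPI response
// fields should be taken from the API docs.
interface MonsterAPIPayload {
  response: { text: string };
}

function toOpenAIChatCompletion(payload: MonsterAPIPayload, model: string) {
  return {
    id: Date.now().toString(),              // random-enough id so SDKs don't break
    object: 'chat.completion' as const,     // fixed value per the OpenAI spec
    created: Math.floor(Date.now() / 1000), // unix timestamp in seconds
    model,
    choices: [
      {
        index: 0,
        message: { role: 'assistant', content: payload.response.text },
        finish_reason: 'stop',
      },
    ],
  };
}
```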
src/providers/monsterapi/generate.ts
Outdated
generateInvalidProviderResponseError, | ||
} from '../utils'; | ||
|
||
export const MonsterAPIChatCompleteConfig: ProviderConfig = { |
As per the API docs, some of the mappings are missing here. Can you please check this once? If a param is similar to an existing OpenAI param, you can use the OpenAI param name as input and map it to Monster's param. If there is no OpenAI-compatible param, you can add it as a new mapping as-is.
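As a sketch of that mapping convention: OpenAI-style names come in as keys and are translated to provider-specific names via `param`. The MonsterAPI-side names and default values below are assumptions for illustration, not taken from the docs:

```typescript
// Minimal stand-in for the gateway's per-parameter config shape.
type ParamConfig = { param: string; default?: number; min?: number; max?: number };
type ProviderConfig = Record<string, ParamConfig>;

const MonsterAPIChatCompleteConfig: ProviderConfig = {
  // OpenAI param name in, MonsterAPI param name out (assumed names):
  temperature: { param: 'temp', default: 1, min: 0, max: 2 },
  max_tokens: { param: 'max_length', default: 256 },
  // No OpenAI equivalent, so mapped through as-is:
  beam_size: { param: 'beam_size', default: 1 },
};
```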
src/providers/monsterapi/generate.ts (Outdated)
As part of the naming convention that we follow in the gateway, please use `chatComplete` instead of `generate`.
Hey @Dheeraj-Bhandari - Just checking up on this. Please let me know if there are any blockers.
Sure, I'm working on the required changes. I will update you once it's done.
@VisargD Added the required changes as per your suggestions. Please review/merge this PR ASAP.
src/providers/monsterapi/api.ts (Outdated)
},
getEndpoint: ({ fn }) => {
  switch (fn) {
    case 'generate':
This case should also use `chatComplete` instead of `generate`, and the default should return an empty string. Basically this:
switch (fn) {
  case 'chatComplete':
    return '/generate';
  default:
    return '';
}
Done
  min: 0,
  max: 1,
},
temp: {
The OpenAI-compatible name is `temperature`, not `temp`. So the input will be `temperature`, which can then be mapped to the MonsterAPI-specific param name.
Done
We can remove the comments from the file as they are not required; they are part of the GitHub PR review and should not be part of the actual code.
Done
@VisargD Required changes made. Please review the PR.
Looks good to me. There is one problem that I found; I am not sure if it's expected. The API error response structure that you have used in the chatComplete transformer is not aligned with the MonsterAPI docs. The docs show 2 types of error response:
But the code written in this PR does not handle either of them.
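The two documented error shapes are not reproduced in this thread, so both shapes in the sketch below are placeholders. The point is the pattern rather than the exact fields: detect an error payload before the success branch and map it to an OpenAI-style error object:

```typescript
// Placeholder error shapes; replace with the two shapes from the
// MonsterAPI docs. Shape 2 mimics a typical validation-error list.
type MonsterAPIError =
  | { message: string }
  | { detail: { msg: string }[] };

function toOpenAIError(payload: MonsterAPIError, status: number) {
  // Narrow the union by checking which placeholder field is present.
  const message =
    'message' in payload
      ? payload.message
      : payload.detail.map((d) => d.msg).join('; ');
  return {
    error: { message, type: null, param: null, code: String(status) },
    provider: 'monsterapi',
  };
}
```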
Hey @DheerajQblocks - Can you please check the last comment and the merge conflicts? Everything else looks good to me.
Sure
Hi @VisargD
@VisargD I have resolved the conflict. Please review/merge it on priority.
Description:
This PR adds support for the MonsterAPI provider to our project. It includes the necessary configurations, request/response structures, and transformation logic to integrate MonsterAPI into our project.
Changes:
- provider/index.ts
- types.ts
- provider/transform.ts

Additional Information:
- API endpoint: https://llm.monsterapi.ai/v1/generate

Testing:
Screenshots:
Related Issue: #279 (Add MonsterAPI provider)