Replies: 3 comments 1 reply
-
We support many different providers: OpenAI-compatible providers (such as vLLM) are supported, among others, via LiteLLM.
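For context, "OpenAI-compatible" means the provider accepts the same request shape as OpenAI's `/v1/chat/completions` endpoint, which is why one adapter (LiteLLM) can cover vLLM, OpenRouter, and similar servers. A minimal sketch of that request body (names here are illustrative, not taken from pr-agent's code):

```python
import json

def chat_completion_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body POSTed to /v1/chat/completions on any
    OpenAI-compatible server (vLLM, OpenRouter, etc.)."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body).encode("utf-8")
```

Any server that understands this payload (and returns the matching response schema) can sit behind the same client code.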
-
@yamijuan Did you figure out how to use OpenRouter?
-
If OpenRouter (which I have not used) provides an OpenAI-compatible endpoint, then it is fully supported with pr-agent.
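The configuration might look something like the sketch below. The key names follow pr-agent's general pattern of a model setting plus OpenAI-style credentials, but the exact keys and the base URL are assumptions; verify them against the pr-agent and LiteLLM documentation for your version:

```toml
# In .secrets.toml (assumed layout — verify key names for your pr-agent version):
[openai]
key = "sk-..."                       # API key for the OpenAI-compatible provider
api_base = "https://example.com/v1"  # base URL of the OpenAI-compatible endpoint (assumption)

# In configuration.toml:
[config]
model = "openai/your-model-name"  # "openai/" prefix routes LiteLLM to the OpenAI-compatible adapter
```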
-
Is there any way, or any plan, to add OpenRouter as a model provider, or to set up a custom model provider that is OpenAI-compatible? Thanks.