
Add custom_host support to allow proxies #216

Closed
VisargD opened this issue Feb 17, 2024 · 0 comments · Fixed by #194
Labels: enhancement (New feature or request), triage

VisargD (Collaborator) commented Feb 17, 2024

Problem statement:

The Gateway currently cannot reach LLM providers that sit behind another proxy, because the base URL for each provider is hard-coded. Many organisations and users access OpenAI, Anthropic, Cohere, etc. through a proxy server, so instead of hitting api.openai.com, their requests need to go to a custom proxy URL. A common reason for such a proxy is to enforce custom auth, spend limits, rate limits, and similar controls that organisations assign to their members instead of exposing the actual LLM key.

Related issue: #124

Solution:

  • Add a custom_host setting in configs where users can pass their proxy URL; the Gateway will simply append the provider endpoint to it.
  • Add an x-portkey-custom-host header that does the same as the config setting, but is applied globally and overrides any custom_host passed in the config.
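The resolution order described above could be sketched roughly as follows. This is an illustrative sketch, not the Gateway's actual implementation: the function name `resolveBaseURL`, the parameter names, and the example proxy URL are all assumptions made for this example.

```typescript
// Sketch of custom_host precedence (illustrative, not Gateway source code).
// Default provider base URLs, normally hard-coded per provider.
const DEFAULT_BASE_URLS: Record<string, string> = {
  openai: "https://api.openai.com/v1",
  anthropic: "https://api.anthropic.com/v1",
};

function resolveBaseURL(
  provider: string,
  configCustomHost?: string, // custom_host from the config
  headerCustomHost?: string, // x-portkey-custom-host header, applied globally
): string {
  // The header overrides the config setting; either overrides the default.
  const customHost = headerCustomHost ?? configCustomHost;
  // Strip a trailing slash so the provider endpoint appends cleanly.
  return (customHost ?? DEFAULT_BASE_URLS[provider]).replace(/\/$/, "");
}

// The Gateway then just adds the provider endpoint to the resolved host:
const url =
  resolveBaseURL("openai", "https://llm-proxy.internal.example/") +
  "/chat/completions";
// → "https://llm-proxy.internal.example/chat/completions"
```

With no custom_host in either place, the default provider URL is used unchanged, so existing requests keep working.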
@VisargD VisargD added the enhancement New feature or request label Feb 17, 2024
@VisargD VisargD self-assigned this Feb 17, 2024