Gateway currently does not allow hitting LLMs that sit behind another proxy, because the base URL for each provider is static in the code. Many organisations and users have OpenAI, Anthropic, Cohere, etc. access behind a proxy server, so instead of api.openai.com, their requests should be able to hit their custom proxy URL. One of the main reasons for such a proxy is to apply custom auth, spend limits, rate limits, etc., which organisations assign to their members instead of exposing the actual LLM key.
Add a custom_host setting in configs where users can pass their proxy URL; the gateway will simply append the provider endpoint to it.
Add an x-portkey-custom-host header that does the same as the config setting, but is applied globally and overrides any custom_host passed in the config.
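The precedence described above (header overrides config, config overrides the built-in default) could be sketched as follows. This is a minimal illustration, not the gateway's actual code; all names here (PROVIDER_BASE_URLS, resolve_base_url) are hypothetical.

```python
# Hypothetical sketch of base-URL resolution with the proposed custom_host
# config setting and x-portkey-custom-host header. Names are illustrative.

PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "anthropic": "https://api.anthropic.com/v1",
}

def resolve_base_url(provider, config=None, headers=None):
    """Pick the base URL for a request.

    Precedence: x-portkey-custom-host header > custom_host in config >
    the provider's static default URL.
    """
    config = config or {}
    headers = headers or {}
    # Header is global and overrides everything else.
    header_host = headers.get("x-portkey-custom-host")
    if header_host:
        return header_host.rstrip("/")
    # Per-request config comes next.
    config_host = config.get("custom_host")
    if config_host:
        return config_host.rstrip("/")
    # Fall back to the hardcoded provider URL.
    return PROVIDER_BASE_URLS[provider]
```

With this scheme, a request carrying the header always goes to the header's host, even if the config also sets custom_host; the gateway then appends the provider endpoint path to whichever base URL wins.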
Problem statement: as described in the issue body above — provider base URLs are static in the code, so requests cannot be routed through an organisation's custom proxy.
Related issue: #124
Solution: