feat: Raise an exception when get error HTTP code (#646) #647
Conversation
* (feat): add `LLMResponseHTTPError` in exceptions.py
* (feat): raise `LLMResponseHTTPError` when the HuggingFace LLM gets an HTTP code greater than or equal to 400 (i.e. some error occurred)
* (fix): adjust tests according to the changes
* (tests): add a test case for handling an error HTTP code
Codecov Report
@@ Coverage Diff @@
```diff
##             main     #647      +/-   ##
==========================================
- Coverage   84.29%   84.28%    -0.01%
==========================================
  Files          66       66
  Lines        3062     3073       +11
==========================================
+ Hits         2581     2590        +9
- Misses        481      483        +2
```
Walkthrough: This update introduces a new exception class, `LLMResponseHTTPError`, raised when a HuggingFace LLM request returns an HTTP error code (400 or above), carrying the status code and an optional error message.
Review Status
Actionable comments generated: 0
Files selected for processing (3)
- pandasai/exceptions.py (1 hunks)
- pandasai/llm/base.py (3 hunks)
- tests/llms/test_base_hf.py (4 hunks)
Additional comments (Suppressed): 8
pandasai/exceptions.py (1)

- 71-87: The new `LLMResponseHTTPError` exception class is well implemented. It takes a `status_code` and an optional `error_msg` as arguments and includes these details in the exception message, which will provide more specific error information when the remote service encounters an issue. However, the docstring for the class could be improved: it currently states that `Exception` is an argument, which is not accurate. It should instead describe the `status_code` and `error_msg` arguments.

```diff
-    Args:
-        Exception (Exception): LLMResponseHTTPError
+    Args:
+        status_code (int): The HTTP status code returned by the remote server.
+        error_msg (str, optional): An optional error message. Defaults to None.
```
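For reference, a minimal sketch of what such a class might look like with the suggested docstring. The real implementation lives in pandasai/exceptions.py; the exact message format below is an assumption, not the library's wording.

```python
# Minimal sketch of the exception described above. The real class lives in
# pandasai/exceptions.py; the exact message format here is an assumption.
from typing import Optional


class LLMResponseHTTPError(Exception):
    """Raised when a remote LLM service replies with an HTTP error code.

    Args:
        status_code (int): The HTTP status code returned by the remote server.
        error_msg (str, optional): An optional error message. Defaults to None.
    """

    def __init__(self, status_code: int, error_msg: Optional[str] = None):
        self.status_code = status_code
        message = f"Remote server responded with HTTP code {status_code}"
        if error_msg:
            message += f": {error_msg}"
        super().__init__(message)
```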
tests/llms/test_base_hf.py (4)

- 3-7: The import of `LLMResponseHTTPError` is new and is used in the new test `test_query_http_error_401`. This is consistent with the PR summary.

- 15-20: A new fixture, `api_response_401`, is introduced to simulate a 401 HTTP error response. It is used in the new test `test_query_http_error_401`.

- 37-43: The `status_code` attribute is now being set on the `response_mock` object. It is used in the new test `test_query_http_error_401` to simulate different HTTP status codes.

- 57-79: A new test, `test_query_http_error_401`, is introduced. It mocks the `requests.post` method to return a 401 HTTP error response and checks that the new exception `LLMResponseHTTPError` is correctly raised. This is consistent with the PR summary.
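To make the shape of that test concrete, here is a condensed, self-contained sketch. The stand-in `query()` function below mirrors only the status-code check the PR adds to `HuggingFaceLLM.query()`, so the snippet runs on its own; the URL, payload shape, and error body are illustrative assumptions rather than the exact contents of tests/llms/test_base_hf.py.

```python
# Self-contained sketch of the new test's shape. The stand-in query()
# mirrors the status-code check the PR adds to HuggingFaceLLM.query();
# the URL, payload shape, and error body are illustrative assumptions.
from unittest.mock import MagicMock, patch

import pytest
import requests

from pandasai.exceptions import LLMResponseHTTPError

API_URL = "https://api-inference.huggingface.co/models/some-model"  # placeholder


def query(payload: dict) -> str:
    response = requests.post(API_URL, json=payload)
    if response.status_code >= 400:
        raise LLMResponseHTTPError(
            status_code=response.status_code,
            error_msg=response.json().get("error"),
        )
    return response.json()[0]["generated_text"]


def test_query_http_error_401():
    # Simulate the HuggingFace API rejecting the request (e.g. invalid token).
    response_mock = MagicMock()
    response_mock.status_code = 401
    response_mock.json.return_value = {"error": "Authorization header is invalid"}

    with patch("requests.post", return_value=response_mock):
        with pytest.raises(LLMResponseHTTPError):
            query({"inputs": "What is 2 + 2?"})
```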
pandasai/llm/base.py (3)

- 27-30: The new `LLMResponseHTTPError` exception is imported correctly. This exception will be used to handle HTTP error responses from the HuggingFace API.

- 325-330: The docstring is updated to include the new `LLMResponseHTTPError` exception in the `Raises:` section. This provides clear documentation of the exceptions this function might raise.

- 337-348: The code now checks the HTTP status code of the response. If it is 400 or above, it raises an `LLMResponseHTTPError` with the status code and the error message (if any). This is good practice, as it allows more specific error handling in the calling code.

```diff
-        return response.json()[0]["generated_text"]
+        if response.status_code >= 400:
+            try:
+                error_msg = response.json().get("error")
+            except (requests.exceptions.JSONDecodeError, TypeError):
+                error_msg = None
+
+            raise LLMResponseHTTPError(
+                status_code=response.status_code, error_msg=error_msg
+            )
+
+        return response.json()[0]["generated_text"]
```
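As a usage note, calling code can now distinguish HTTP failures from other errors. A hedged sketch of that branching follows; the `hf_llm` parameter stands in for any `HuggingFaceLLM` instance, and the payload shape is an assumption for illustration.

```python
# Hedged usage sketch: branching on the new exception in calling code.
# hf_llm stands in for any HuggingFaceLLM instance; the payload shape
# is an assumption for illustration.
from pandasai.exceptions import LLMResponseHTTPError


def generate_or_explain(hf_llm, prompt: str) -> str:
    try:
        return hf_llm.query({"inputs": prompt})
    except LLMResponseHTTPError as exc:
        # 400+ from the inference API, e.g. 401 for an invalid token
        # or 503 while the remote model is loading.
        raise RuntimeError(f"HuggingFace API error: {exc}") from exc
```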
@nautics889 thanks a lot for the improvement, merging!
This PR aims to resolve the problem described in #646.

Add:
- raising `LLMResponseHTTPError` when the API for `HuggingFaceLLM` responds with a 400+ code (e.g. invalid token, a remote server under maintenance, etc.)

P.S. Added it only for the `HuggingFaceLLM` class, because the other providers use different API calls (e.g. the `openai` SDK package for OpenAI).

Linked issue: raise `LLMResponseHTTPError` when a remote provider responded with an error code (#646)