
feat: Raise an exception when get error HTTP code (#646) #647

Merged · 2 commits · Oct 17, 2023

Conversation

@nautics889 (Contributor) commented Oct 14, 2023

This PR aims to resolve the problem described in #646.

Add:

  • Raising LLMResponseHTTPError when the API behind HuggingFaceLLM responds with a 400+ status code (e.g. invalid token, the remote server is under maintenance, etc.)
  • Corresponding tests that check the exception is raised

P.S. This is added only for the HuggingFaceLLM class, because the other providers make different API calls (OpenAI goes through the openai SDK package, for example).
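For downstream code, the new exception can be caught like any other. A minimal sketch — the LLMResponseHTTPError stand-in below only mirrors the signature described in this PR (status_code plus optional error_msg); in pandasai the real class lives in pandasai/exceptions.py, and handle_llm_call is a hypothetical helper for illustration:

```python
# Stand-in mirroring the exception added in this PR (pandasai/exceptions.py);
# the real class takes a status_code and an optional error_msg.
class LLMResponseHTTPError(Exception):
    def __init__(self, status_code, error_msg=None):
        self.status_code = status_code
        self.error_msg = error_msg
        super().__init__(
            f"The remote server responded with HTTP {status_code}"
            + (f": {error_msg}" if error_msg else "")
        )


def handle_llm_call(do_query):
    """Call an LLM query function and turn HTTP errors into friendly text."""
    try:
        return do_query()
    except LLMResponseHTTPError as exc:
        # Callers can branch on the status code, e.g. 401 -> bad token.
        if exc.status_code == 401:
            return "Check your HuggingFace API token."
        return f"LLM service error ({exc.status_code})."
```

Before this PR, such failures surfaced later as opaque parsing errors; catching a dedicated exception lets callers react to the specific HTTP status.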



Summary by CodeRabbit

  • New Feature: Introduced a new error handling mechanism for our software. Now, when our system interacts with external services and encounters an issue, it will provide more detailed feedback. This will help us diagnose and resolve issues more efficiently.
  • Test: Added new tests to ensure this error handling works as expected. This will help maintain the reliability of our system and ensure a smooth user experience.

* (feat): add `LLMResponseHTTPError` in exceptions.py
* (feat): raise `LLMResponseHTTPError` when the HuggingFace LLM gets an HTTP
  code greater than or equal to 400 (i.e. some error occurred)
* (fix): corresponding changes in tests
* (tests): add test case for handling error HTTP code
@codecov-commenter

Codecov Report

Merging #647 (eaa00c3) into main (f9facf3) will decrease coverage by 0.01%.
The diff coverage is 81.81%.


@@            Coverage Diff             @@
##             main     #647      +/-   ##
==========================================
- Coverage   84.29%   84.28%   -0.01%     
==========================================
  Files          66       66              
  Lines        3062     3073      +11     
==========================================
+ Hits         2581     2590       +9     
- Misses        481      483       +2     
Files Coverage Δ
pandasai/exceptions.py 90.90% <100.00%> (+2.67%) ⬆️
pandasai/llm/base.py 88.02% <66.66%> (-0.95%) ⬇️


@nautics889 nautics889 marked this pull request as ready for review October 14, 2023 21:33
coderabbitai bot commented Oct 14, 2023

Walkthrough

This update introduces a new exception class LLMResponseHTTPError to handle HTTP error responses from a remote LLM service. The query method in llm/base.py has been updated to raise this exception for HTTP status codes >= 400. Corresponding tests have been added to ensure the correct handling of these exceptions.

Changes

File Summary
pandasai/exceptions.py Introduced a new exception class LLMResponseHTTPError for handling HTTP error responses from the LLM service.
pandasai/llm/base.py Updated the query method to raise LLMResponseHTTPError for HTTP status codes >= 400.
tests/llms/test_base_hf.py Added tests for the new exception handling, including a fixture for a 401 HTTP error response and a test method test_query_http_error_401.

🐇
In the land of code, where the shadows lie,
An exception rose, to catch an error nigh.
With a status code, and an optional message,
It guards our service, like a sturdy hedge.
Tests are written, to ensure it's right,
Now our code is stronger, ready for the fight. 🛡️



@coderabbitai coderabbitai bot left a comment
Review Status

Actionable comments generated: 0

Files that changed from the base of the PR, between f9facf3 and eaa00c3.
Files selected for processing (3)
  • pandasai/exceptions.py (1 hunks)
  • pandasai/llm/base.py (3 hunks)
  • tests/llms/test_base_hf.py (4 hunks)
Additional comments (Suppressed): 8
pandasai/exceptions.py (1)
  • 71-87: The new LLMResponseHTTPError exception class is well implemented. It takes a status_code and an optional error_msg as arguments, and includes these details in the exception message. This will provide more specific error information when the remote service encounters an issue.

However, the docstring for the class could be improved. It currently states that Exception is an argument, which is not accurate. Instead, it should describe the status_code and error_msg arguments.

-    Args:
-        Exception (Exception): LLMResponseHTTPError
+    Args:
+        status_code (int): The HTTP status code returned by the remote server.
+        error_msg (str, optional): An optional error message. Defaults to None.
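Put together, an exception class matching the reviewed signature might look like this — a sketch based on the review comments above, not the verbatim library source (the exact message wording is an assumption):

```python
class LLMResponseHTTPError(Exception):
    """Raised when a remote LLM service replies with an HTTP error code.

    Args:
        status_code (int): The HTTP status code returned by the remote server.
        error_msg (str, optional): An optional error message. Defaults to None.
    """

    def __init__(self, status_code, error_msg=None):
        self.status_code = status_code
        self.error_msg = error_msg
        # Include the status code (and message, if any) in the exception text.
        detail = f": {error_msg}" if error_msg else ""
        super().__init__(
            f"The remote server has responded with an error HTTP code: "
            f"{status_code}{detail}"
        )
```

Storing status_code as an attribute (rather than only embedding it in the message) is what lets calling code branch on specific codes.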
tests/llms/test_base_hf.py (4)
  • 3-7: The import of LLMResponseHTTPError is new and is used in the new test test_query_http_error_401. This is consistent with the PR summary.

  • 15-20: A new fixture api_response_401 is introduced to simulate a 401 HTTP error response. This is used in the new test test_query_http_error_401.

  • 37-43: The status_code attribute is now being set on the response_mock object. This is used in the new test test_query_http_error_401 to simulate different HTTP status codes.

  • 57-79: A new test test_query_http_error_401 is introduced. This test mocks the requests.post method to return a 401 HTTP error response and checks if the new exception LLMResponseHTTPError is correctly raised. This is consistent with the PR summary.
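The described test shape can be sketched with stdlib mocks alone. Here query is a simplified stand-in for HuggingFaceLLM.query() that takes the HTTP-post callable as a parameter, so no network library is needed; the real test in tests/llms/test_base_hf.py patches requests.post instead:

```python
from unittest.mock import MagicMock


class LLMResponseHTTPError(Exception):  # stand-in for pandasai.exceptions
    def __init__(self, status_code, error_msg=None):
        self.status_code = status_code
        super().__init__(f"HTTP error {status_code}: {error_msg}")


def query(payload, post):
    """Simplified stand-in for HuggingFaceLLM.query(): POST, then check status."""
    response = post(json=payload)
    if response.status_code >= 400:
        try:
            error_msg = response.json().get("error")
        except (ValueError, TypeError):
            error_msg = None
        raise LLMResponseHTTPError(status_code=response.status_code,
                                   error_msg=error_msg)
    return response.json()[0]["generated_text"]


def test_query_http_error_401():
    # Fixture equivalent: a mocked 401 response, as in api_response_401.
    response_mock = MagicMock()
    response_mock.status_code = 401
    response_mock.json.return_value = {"error": "Authorization header is invalid"}
    post_mock = MagicMock(return_value=response_mock)
    try:
        query({"inputs": "hello"}, post=post_mock)
    except LLMResponseHTTPError as exc:
        assert exc.status_code == 401
    else:
        raise AssertionError("expected LLMResponseHTTPError")
```

In the real suite the same effect is achieved with pytest.raises and patching of requests.post, but the assertion is the same: a 401 response must surface as LLMResponseHTTPError, not as a downstream parsing failure.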

pandasai/llm/base.py (3)
  • 27-30: The new LLMResponseHTTPError exception is imported correctly. This exception will be used to handle HTTP error responses from the HuggingFace API.

  • 325-330: The docstring is updated to include the new LLMResponseHTTPError exception in the Raises: section. This provides clear documentation on what exceptions this function might raise.

  • 337-348: The code now checks the HTTP status code of the response. If it's 400 or above, it raises an LLMResponseHTTPError with the status code and error message (if any). This is a good practice as it allows for more specific error handling in the calling code.

-         return response.json()[0]["generated_text"]
+         if response.status_code >= 400:
+             try:
+                 error_msg = response.json().get("error")
+             except (requests.exceptions.JSONDecodeError, TypeError):
+                 error_msg = None
+ 
+             raise LLMResponseHTTPError(
+                 status_code=response.status_code, error_msg=error_msg
+             )
+ 
+         return response.json()[0]["generated_text"]

@nautics889 nautics889 changed the title feat: raise exc when get error HTTP code (#646) Feat: Raise an exception when get error HTTP code (#646) Oct 14, 2023
@gventuri
Copy link
Collaborator

@nautics889 thanks a lot for the improvement, merging!

@gventuri gventuri changed the title Feat: Raise an exception when get error HTTP code (#646) feat: Raise an exception when get error HTTP code (#646) Oct 17, 2023
@gventuri gventuri merged commit 16e5241 into Sinaptik-AI:main Oct 17, 2023
Successfully merging this pull request may close these issues.

Raising LLMResponseHTTPError when a remote provider responded with an error code
3 participants