Fix: 429 Too Many Requests #45
base: main
Conversation
Signed-off-by: fortishield <[email protected]>
Reviewer's Guide by Sourcery

This pull request enhances the crawler's resilience to "429 Too Many Requests" errors by implementing more aggressive retry logic. It reduces the retry wait times for the main crawler client and introduces a dedicated retryable HTTP client with exponential backoff for individual GET requests.

Updated class diagram for Crawler

```mermaid
classDiagram
    class Crawler {
        -http: retryablehttp.Client
        +NewCrawler(opt Option) Crawler
        +httpGet(ctx context.Context, url string) (*http.Response, error)
    }
    class retryablehttp.Client {
        -RetryMax: int
        -RetryWaitMin: Duration
        -RetryWaitMax: Duration
        -Backoff: Backoff
        +Do(req *http.Request) (*http.Response, error)
    }
    Crawler -- retryablehttp.Client : uses
```
PR Reviewer Guide 🔍

Here are some key observations to aid the review process:
Hey @FortiShield - I've reviewed your changes - here's some feedback:
Overall Comments:
- The retry logic seems to be duplicated; consider consolidating it into a single place.
- Consider adding a comment explaining why the retry parameters were changed.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
PR Code Suggestions ✨

Explore these optional code suggestions:
Addresses issue: #
Changes proposed in this pull request:
Summary by Sourcery
Improves the crawler's resilience to "429 Too Many Requests" errors by reducing the minimum and maximum retry wait times, and using the default exponential backoff strategy. Additionally, configures a retryable HTTP client with exponential backoff for handling HTTP requests.