More verbosity and delayed retries #11
Conversation
@@ -270,15 +283,16 @@ private function downloadV2Files(array $requests)

// got an outdated file, possibly fetched from a mirror which was not yet up to date, so retry after 2sec
if ($is404 || $mtime < $userData['minimumFilemtime']) {
    if ($userData['retries'] > 2) {
        // 404s after 3 retries should be deemed to have really been deleted, so we stop retrying
+ sleep($userData['retries']);
This is not necessary IMO. It adds a longer sleep, but with the bump to 10 retries we already get up to 10×2 sec of sleep, which should really be more than enough for everything to sync up upstream.
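For context, the difference between the two strategies can be sketched in a few lines. This is a simplified standalone sketch, not the actual downloadV2Files() code from mirror.php; the fixed 2-second pause stands in for the current behaviour, and the growing delay stands in for the PR's `sleep($userData['retries'])`:

```php
<?php
// Simplified sketch comparing total wait time of the two retry strategies
// discussed here (not the actual mirror.php code).
function totalWait(int $maxRetries, callable $delay): int
{
    $total = 0;
    for ($retry = 1; $retry <= $maxRetries; $retry++) {
        $total += $delay($retry);
    }
    return $total;
}

// Current behaviour: a fixed 2-second pause before each retry.
$fixed = totalWait(10, fn ($retry) => 2);

// PR behaviour: the pause grows with the attempt number.
$growing = totalWait(10, fn ($retry) => $retry);

echo "fixed={$fixed}s growing={$growing}s\n"; // fixed=20s growing=55s
```

So over 10 attempts the fixed pause waits 20 seconds in total, while the growing pause waits 1+2+…+10 = 55 seconds, most of it concentrated in the last attempts.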
Without this growing timeout, my mirror fails even with 10 retries.
I don't know why; maybe an issue with my ISP, with Cloudflare, or something like that?
I retried with another ISP; results:
git clone https://github.com/composer/mirror/
cat mirror.config.php
<?php
return [
// directory where metadata files will get saved
'target_dir' => './mymirror',
// user agent describing your mirror node, if possible include domain name of mirror, and a contact email address
'user_agent' => 'Just testing mirror script'/* TODO Mirror for foo.com ([email protected]) */,
// source repository URL
'repo_url' => 'https://repo.packagist.org',
// source repository hostname (optional, will guess from repo_url)
//'repo_hostname' => 'repo.packagist.org',
// source API URL
'api_url' => 'https://repo.packagist.org',
// how many times the script will run the mirroring step before exiting
'iterations' => 1,
// how many seconds to wait between mirror runs
'iteration_interval' => 5,
// set this to false if you do not run the --v1 mirror job, to ensure that the v2 will then take care of syncing packages.json
'has_v1_mirror' => false,
];
./mirror.php --v2 -v
[...] lots of M [...]
[...] plenty of EEEEEEE [...]
Fatal error: Uncaught Symfony\Component\HttpClient\Exception\TransportException: Timeout was reached for "https://repo.packagist.org/p2/muxtor/yii2-pkk5-component.json". in /home/steph/composer-mirror/vendor/symfony/http-client/Response/CurlResponse.php:317
Result: 3.6 GB
With my PR, using the same config, no more E errors:
[...] lots of M [...]
[1]R[1]R[1]R[1]R[1]R[1]R[1]R[1]R[2]R[2]R[2]R[2]R[2]R[2]R[2]R[2]R[3]R[3]R[3]R[3]R[3]R[3]R[3]R[3]R[4]R[4]R[4]R[4]R[4]R[4]R[4]R[4]R[5]R[5]R[5]R[5]R[5]R[5]R[5]R[5]R[6]R[6]R[6]R[6]R[6]R[6]R[6]R[6]R[7]R[7]R[7]R[7]R[7]R[7]R[7]R[7]R[8]R[8]R[8]R[8]R[8]R[8]R[8]R[8]R[9]R[9]R[9]R[9]R[9]R[9]R[9]R[9]R[10]R[10]R[10]R[10]R[10]R[10]R[10]R[10]????????
Result: 6.2 GB
This PR adds more verbosity and delayed retries.
For a reason that I can't explain, when has_v1_mirror is true some downloads fail, and it seems that retrying immediately doesn't always recover them. Adding a fixed pause (i.e. 1 second in commit 1066162) does not always seem to be sufficient either. Calming the little 🐰 down more and more forces him to gather things much better! 😜
The numbers between brackets in the output show the retry count.
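The bracketed retry markers seen above could be produced roughly like this. This is an illustrative sketch only; the exact output code in the PR may differ:

```php
<?php
// Illustrative sketch of the verbose retry marker, e.g. "[3]R" for the
// third retry of a file (not necessarily the PR's exact output code).
function retryMarker(int $retries): string
{
    return '[' . $retries . ']R';
}

// Eight files each on their first retry would print: [1]R[1]R...[1]R
echo str_repeat(retryMarker(1), 8), "\n";
```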
time ./mirror.php --v2 -v
Run from an empty folder, started 2021-06-12 09:29 PM UTC. Seen at the end of the process:
time ./mirror.php --resync -v
Started at 2021-06-12 10:16 PM UTC. Seen at the end of the process:
The mirror is fully operational.