I have used vemonet/setup-spark@v1 to install and set up Spark v3.3.1 in my GitHub Actions QA workflow for over a year. All of a sudden, the QA workflow started failing with the following error:
/usr/bin/docker exec 4bb9752fac4c215c6696cf43e736fffcd5ce38bf7815d1d879970ed52e53fcfc sh -c "cat /etc/*release | grep ^ID"
18:11:04 - Spark will be installed to /__w/project
18:11:04 - Downloading Spark binary from https://archive.apache.org/dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop2.tgz to /__w/project
Request timeout: /dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop2.tgz
Waiting 20 seconds before trying again
Request timeout: /dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop2.tgz
Waiting 16 seconds before trying again
18:11:55 - Issue installing Spark: check if the Spark version and Hadoop versions you are using are part of the ones proposed on the Spark download page at https://spark.apache.org/downloads.html
Error: Error: Request timeout: /dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop2.tgz
Error: Request timeout: /dist/spark/spark-3.3.1/spark-3.3.1-bin-hadoop2.tgz
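For context, the step that triggers this is roughly the following (a minimal sketch; the input names follow the setup-spark README, and my actual workflow may set additional options):

```yaml
# Hypothetical minimal workflow step reproducing the setup described above.
# spark-version and hadoop-version match the archive path in the error log
# (spark-3.3.1-bin-hadoop2.tgz).
- name: Set up Spark
  uses: vemonet/setup-spark@v1
  with:
    spark-version: '3.3.1'
    hadoop-version: '2'
```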
Even though the GitHub runners cannot download the Spark binary from the URL above, I can access it manually. Please let me know if you have any suggestions for solving this issue. Note that I cannot switch to a different Spark version at this point. Thanks!