Build #39
build_main.yml
on: push
Run / Check changes (42s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 6s)
Run / Run TPC-DS queries with SF=1 (46m 57s)
Run / Run Docker integration tests (40m 41s)
Run / Run Spark on Kubernetes Integration test (58m 21s)
Run / Run Spark UI tests (21s)
Matrix: Run / build
Matrix: Run / maven-build
Run / Build modules: sparkr (26m 40s)
Run / Linters, licenses, dependencies and documentation generation (55m 23s)
Matrix: Run / pyspark
Annotations
12 errors and 2 warnings
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-b736db8e635f4a94-exec-1".

Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-5d065e8e636030d6-exec-1".

Run / Run Spark on Kubernetes Integration test
sleep interrupted

Run / Run Spark on Kubernetes Integration test
sleep interrupted

Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007fc67c4e8fb0@382eed25 rejected from java.util.concurrent.ThreadPoolExecutor@604d16a7[Shutting down, pool size = 3, active threads = 2, queued tasks = 0, completed tasks = 371]

Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007fc67c4e8fb0@386296f7 rejected from java.util.concurrent.ThreadPoolExecutor@604d16a7[Shutting down, pool size = 2, active threads = 1, queued tasks = 0, completed tasks = 372]

Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-ea24758e63729f05-exec-1".

Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-c73bd38e63738707-exec-1".

Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-3f8ea38e63772348-exec-1".

Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-79c152c46fb4448ba7b88a76f2d449c9-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-79c152c46fb4448ba7b88a76f2d449c9-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..

Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect
The job running on runner GitHub Actions 13 has exceeded the maximum execution time of 300 minutes.

Run / Build modules: pyspark-mllib, pyspark-ml, pyspark-ml-connect
The operation was canceled.
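
The 300-minute ceiling is a job-level execution limit; if it is set explicitly in build_main.yml it would be declared with `timeout-minutes`. A minimal sketch, with the job id and step contents assumed rather than taken from the real workflow:

```yaml
jobs:
  pyspark:                    # illustrative job id, not the real one
    runs-on: ubuntu-latest
    timeout-minutes: 300      # the job is cancelled once this limit is exceeded
    steps:
      - uses: actions/checkout@v4
      - name: Run PySpark tests
        run: ./dev/run-tests  # placeholder; the actual invocation in build_main.yml differs
```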

Run / Protobuf breaking change detection and Python CodeGen check
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: bufbuild/buf-lint-action@v1, bufbuild/buf-breaking-action@v1. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
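
The warning means the two buf actions referenced by the Protobuf check still ship a Node.js 16 runtime. A hedged sketch of what the affected steps typically look like; the input paths and the `against` reference are placeholders, and the fix is to bump each action to a release built for Node.js 20 once its maintainers publish one:

```yaml
    steps:
      - uses: bufbuild/buf-setup-action@v1
      - uses: bufbuild/buf-lint-action@v1        # flagged: still runs on Node.js 16
        with:
          input: path/to/protobuf                # placeholder path
      - uses: bufbuild/buf-breaking-action@v1    # flagged: still runs on Node.js 16
        with:
          input: path/to/protobuf                # placeholder path
          against: https://github.com/apache/spark.git#branch=master   # placeholder reference
```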

Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
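
This warning comes from an artifact-upload step whose glob matched no files (these modules evidently produced no XML test reports in this run). A minimal sketch, assuming actions/upload-artifact, of how such a step can tolerate an empty match; the artifact name is a placeholder:

```yaml
      - name: Upload test results
        if: always()                     # attempt the upload even when earlier steps failed
        uses: actions/upload-artifact@v4
        with:
          name: test-results             # placeholder artifact name
          path: '**/target/test-reports/*.xml'
          if-no-files-found: ignore      # silences the "No files were found" annotation
```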
Artifacts
Produced during runtime
Name | Size | Status
---|---|---
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.9 | 71 KB | Expired
unit-tests-log-pyspark-mllib, pyspark-ml, pyspark-ml-connect--17-hadoop3-hive2.3-python3.9 | 165 KB | Expired