Build (master, Scala 2.13, Hadoop 3, JDK 21) #443

Re-run triggered: September 4, 2024 01:26
Status: Success
Total duration: 1h 14m 46s
Artifacts: 21

build_java21.yml (on: schedule)
Run / Check changes (32s)
Run / Base image build (55s)
Run / Protobuf breaking change detection and Python CodeGen check (58s)
Run / Run TPC-DS queries with SF=1 (1h 32m)
Run / Run Docker integration tests (1h 17m)
Run / Run Spark on Kubernetes Integration test (1h 0m)
Run / Run Spark UI tests (27s)
Matrix: Run / build
Run / Build modules: sparkr (25m 41s)
Run / Linters, licenses, and dependencies (0s)
Run / Documentation generation (0s)
Matrix: Run / pyspark

Annotations

10 errors and 1 warning
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-5f960f91b62361ec-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-53c62591b6244a8f-exec-1".
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda/0x00007f04204f8220@580835f7 rejected from java.util.concurrent.ThreadPoolExecutor@7943b3ed[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 363]
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda/0x00007f04204f8220@aa86987 rejected from java.util.concurrent.ThreadPoolExecutor@7943b3ed[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 364]
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-c0b0d391b63733f5-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-c1ecd991b6381f95-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-36881791b63bcc68-exec-1".
Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-cb3297c7a3ab435083609f8eb96a2a97-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-cb3297c7a3ab435083609f8eb96a2a97-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
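
The repeated HashSet() did not contain "decomtest-...-exec-1" errors above come from the "Run Spark on Kubernetes Integration test" job and look like ScalaTest assertion failures on executor decommissioning. As a minimal sketch, assuming only ScalaTest on the classpath (this is not the actual Spark integration suite; the class, test, and variable names below are illustrative), a failed contains assertion against an empty set reports a message of exactly that shape:

import scala.collection.immutable.HashSet

import org.scalatest.funsuite.AnyFunSuite

// Minimal sketch, not Spark's actual Kubernetes test code: ScalaTest's assert
// macro renders a failed `contains` check as
//   HashSet() did not contain "decomtest-...-exec-1"
// which matches the shape of the error annotations listed above.
class DecommissionAssertionSketch extends AnyFunSuite {
  test("decommissioned executor pod is tracked") {
    val trackedExecPods = HashSet.empty[String]        // hypothetical set of tracked executor pod names
    val podName = "decomtest-5f960f91b62361ec-exec-1"  // pod name copied from the annotations above
    assert(trackedExecPods.contains(podName))          // fails: HashSet() did not contain "decomtest-..."
  }
}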

Artifacts

Produced during runtime
Name (Size)
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant--21-hadoop3-hive2.3 (798 KB)
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--21-hadoop3-hive2.3 (364 KB)