I was trying to compile spark-sql-perf for Spark 3.2.0. The build fails with the following errors:
#6 219.9 [info] Compiling 67 Scala sources to /tmp/spark-sql-perf/target/scala-2.13/classes...
#6 221.2 [info] 'compiler-interface' not yet compiled for Scala 2.13.1. Compiling...
#6 229.1 12 warnings found
#6 229.1 [info] Compilation completed in 7.939 s
#6 231.2 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/Benchmark.scala:478: overloaded method value createDataFrame with alternatives:
#6 231.2 [error] [A <: Product](data: Seq[A])(implicit evidence$2: reflect.runtime.universe.TypeTag[A])org.apache.spark.sql.DataFrame <and>
#6 231.2 [error] [A <: Product](rdd: org.apache.spark.rdd.RDD[A])(implicit evidence$1: reflect.runtime.universe.TypeTag[A])org.apache.spark.sql.DataFrame
#6 231.2 [error] cannot be applied to (scala.collection.mutable.ArrayBuffer[com.databricks.spark.sql.perf.BenchmarkResult])
#6 231.2 [error] val tbl = sqlContext.createDataFrame(currentResults)
#6 231.2 [error] ^
#6 231.3 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/Benchmark.scala:485: overloaded method value createDataFrame with alternatives:
#6 231.3 [error] [A <: Product](data: Seq[A])(implicit evidence$2: reflect.runtime.universe.TypeTag[A])org.apache.spark.sql.DataFrame <and>
#6 231.3 [error] [A <: Product](rdd: org.apache.spark.rdd.RDD[A])(implicit evidence$1: reflect.runtime.universe.TypeTag[A])org.apache.spark.sql.DataFrame
#6 231.3 [error] cannot be applied to (scala.collection.mutable.ArrayBuffer[com.databricks.spark.sql.perf.ExperimentRun])
#6 231.3 [error] val tbl = sqlContext.createDataFrame(currentRuns)
#6 231.3 [error] ^
#6 231.3 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/Benchmarkable.scala:88: value getStackTraceString is not a member of Throwable
#6 231.3 [error] e.getMessage + ":\n" + e.getStackTraceString)))
#6 231.3 [error] ^
#6 232.3 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/mllib/MLPipelineStageBenchmarkable.scala:40: value getStackTraceString is not a member of Throwable
#6 232.3 [error] println(s"$this error in beforeBenchmark: ${e.getStackTraceString}")
#6 232.3 [error] ^
#6 232.4 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/mllib/MLPipelineStageBenchmarkable.scala:106: value getStackTraceString is not a member of Exception
#6 232.4 [error] e.getMessage + ":\n" + e.getStackTraceString)))
#6 232.4 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:84: not found: type Pair
#6 233.6 [error] extends RandomDataGenerator[Pair[Double, Double]] {
#6 233.6 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:91: not found: type Pair
#6 233.6 [error] override def nextValue(): Pair[Double, Double] = {
#6 233.6 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:97: not found: type Pair
#6 233.6 [error] new Pair[Double, Double](left, right)
#6 233.6 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:112: not found: type Pair
#6 233.6 [error] private class RealLabelPairGenerator() extends RandomDataGenerator[Pair[Double, Double]] {
#6 233.6 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:116: not found: type Pair
#6 233.6 [error] override def nextValue(): Pair[Double, Double] =
#6 233.6 [error] ^
#6 233.6 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:117: not found: type Pair
#6 233.6 [error] new Pair[Double, Double](rng.nextDouble(), rng.nextDouble())
#6 233.6 [error] ^
#6 233.7 [error] /tmp/spark-sql-perf/src/main/scala/org/apache/spark/ml/ModelBuilderSSP.scala:188: not found: type Pair
#6 233.7 [error] labelGenerator: RandomDataGenerator[Pair[Double, Double]],
#6 233.7 [error] ^
#6 233.7 [error] 12 errors found
#6 234.6 [error] (compile:compileIncremental) Compilation failed
#6 234.6 [error] Total time: 56 s, completed Nov 4, 2021 12:50:55 PM
#6 ERROR: executor failed running [/bin/sh -c git clone https://github.com/abin-tiger/spark-sql-perf.git -b spark-${SPARK_VERSION} /tmp/spark-sql-perf && cd /tmp/spark-sql-perf/ && sbt +package]: exit code: 1
------
 > [sbt 2/2] RUN git clone https://github.com/abin-tiger/spark-sql-perf.git -b spark-3.2.0 /tmp/spark-sql-perf && cd /tmp/spark-sql-perf/ && sbt +package:
#6 232.3 [error] ^
#6 232.4 [error] /tmp/spark-sql-perf/src/main/scala/com/databricks/spark/sql/perf/mllib/MLPipelineStageBenchmarkable.scala:106: value getStackTraceString is not a member of Exception
#6 232.4 [error] e.getMessage + ":\n" + e.getStackTraceString)))
#6 232.4 [error] ^
#6 233.7 [error] labelGenerator: RandomDataGenerator[Pair[Double, Double]],
#6 233.7 [error] ^
#6 233.7 [error] 12 errors found
#6 234.6 [error] (compile:compileIncremental) Compilation failed
#6 234.6 [error] Total time: 56 s, completed Nov 4, 2021 12:50:55 PM
------
error: failed to solve: executor failed running [/bin/sh -c git clone https://github.com/abin-tiger/spark-sql-perf.git -b spark-${SPARK_VERSION} /tmp/spark-sql-perf && cd /tmp/spark-sql-perf/ && sbt +package]: exit code: 1
Error: buildx call failed with: error: failed to solve: executor failed running [/bin/sh -c git clone https://github.com/abin-tiger/spark-sql-perf.git -b spark-${SPARK_VERSION} /tmp/spark-sql-perf && cd /tmp/spark-sql-perf/ && sbt +package]: exit code: 1
My fork of the repository: https://github.com/abin-tiger/spark-sql-perf/tree/spark-3.2.0 (the only changes are in build.sbt).
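For what it's worth, all of these errors look like standard Scala 2.13 source incompatibilities rather than Spark 3.2.0 API changes: in 2.13 the default `Seq` is `immutable.Seq` (so a `mutable.ArrayBuffer` no longer matches `createDataFrame`'s `Seq[A]` overload), and both the `getStackTraceString` enrichment and the `Pair` alias for `Tuple2` were removed. Below is a minimal, hypothetical sketch of the kind of source changes involved; none of it is taken from spark-sql-perf, and the names only echo the error messages above.

```scala
// Hypothetical sketch of the Scala 2.13 fixes the errors above point at.
// Nothing here is copied from spark-sql-perf; names mirror the error output.

import scala.collection.mutable.ArrayBuffer

object Scala213Sketch {

  // Benchmark.scala:478/485: Scala 2.13's default Seq is immutable.Seq, so a
  // mutable.ArrayBuffer no longer satisfies createDataFrame's Seq[A] overload.
  // An explicit conversion restores the old behaviour, e.g.
  //   val tbl = sqlContext.createDataFrame(currentResults.toSeq)
  def toImmutableSeq[A](buf: ArrayBuffer[A]): Seq[A] = buf.toSeq

  // Benchmarkable.scala / MLPipelineStageBenchmarkable.scala: the
  // getStackTraceString enrichment was removed in 2.13; formatting the plain
  // Java stack trace gives roughly the same string.
  def stackTraceString(e: Throwable): String =
    e.getStackTrace.mkString("", System.lineSeparator(), System.lineSeparator())

  // ModelBuilderSSP.scala: the Pair alias for Tuple2 was removed in 2.13;
  // plain tuple syntax replaces it.
  def nextLabelPair(rng: scala.util.Random): (Double, Double) =
    (rng.nextDouble(), rng.nextDouble())

  def main(args: Array[String]): Unit = {
    println(toImmutableSeq(ArrayBuffer(1, 2, 3)))
    println(stackTraceString(new RuntimeException("example")))
    println(nextLabelPair(new scala.util.Random(42)))
  }
}
```

If that diagnosis is right, the spark-3.2.0 branch would need source changes along these lines on top of the build.sbt update, or `crossScalaVersions` restricted to 2.12 so that `sbt +package` does not attempt a 2.13 build.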