[SPARK-46645][INFRA] Exclude unittest-xml-reporting in Python 3.12 image #139

GitHub Actions / Report test results failed Jan 10, 2024 in 0s

45739 tests run, 916 skipped, 5 failed.

Annotations

Check failure on line 1 in python/pyspark/sql/tests/pandas/test_pandas_map.py


github-actions / Report test results

python/pyspark/sql/tests/pandas/test_pandas_map.py.test_other_than_dataframe_iter

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_map.py", line 142, in test_other_than_dataframe_iter
    self.check_other_than_dataframe_iter()
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_map.py", line 163, in check_other_than_dataframe_iter
    (self.spark.range(10, numPartitions=3).mapInPandas(bad_iter_elem, "a int").count())
  File "/__w/spark/spark/python/pyspark/sql/dataframe.py", line 1379, in count
    return int(self._jdf.count())
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1322, in __call__
    return_value = get_return_value(
  File "/__w/spark/spark/python/pyspark/errors/exceptions/captured.py", line 215, in deco
    return f(*a, **kw)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 334, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o888.count

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_map.py", line 142, in test_other_than_dataframe_iter
    self.check_other_than_dataframe_iter()
  File "/__w/spark/spark/python/pyspark/testing/utils.py", line 152, in __exit__
    self.log4j.LogManager.getRootLogger().setLevel(self.old_level)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1657, in __getattr__
    answer = self._gateway_client.send_command(command)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused
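
Note on the failing call: the traceback points at check_other_than_dataframe_iter, which runs mapInPandas with an iterator UDF that yields something other than pandas DataFrames. A minimal sketch of that pattern is below; the bad_iter_elem body is an assumption reconstructed from the test name, not the code in test_pandas_map.py. Normally the call fails with a PySpark error about the UDF's return type; in this run the JVM gateway was already gone, so it surfaced as ConnectionRefusedError instead.

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[4]").getOrCreate()

def bad_iter_elem(iterator):
    # Hypothetical body: yields plain ints instead of pandas.DataFrame objects.
    for _ in iterator:
        yield 1

# Mirrors the call at line 163 of the traceback above; mapInPandas expects the
# UDF to yield pandas DataFrames matching the "a int" schema.
spark.range(10, numPartitions=3).mapInPandas(bad_iter_elem, "a int").count()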

Check failure on line 1 in python/pyspark/sql/tests/pandas/test_pandas_map.py


github-actions / Report test results

python/pyspark/sql/tests/pandas/test_pandas_map.py.test_self_join

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_map.py", line 377, in test_self_join
    df1 = self.spark.range(10, numPartitions=3)
  File "/__w/spark/spark/python/pyspark/sql/session.py", line 958, in range
    jdf = self._jsparkSession.range(0, int(start), int(step), int(numPartitions))
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1321, in __call__
    answer = self.gateway_client.send_command(command)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused

Check failure on line 1 in python/pyspark/sql/tests/pandas/test_pandas_map.py


github-actions / Report test results

python/pyspark/sql/tests/pandas/test_pandas_map.py.tearDownClass (pyspark.sql.tests.pandas.test_pandas_map.MapInPandasTests)

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/sql/tests/pandas/test_pandas_map.py", line 425, in tearDownClass
    ReusedSQLTestCase.tearDownClass()
  File "/__w/spark/spark/python/pyspark/testing/sqlutils.py", line 269, in tearDownClass
    super(ReusedSQLTestCase, cls).tearDownClass()
  File "/__w/spark/spark/python/pyspark/testing/utils.py", line 180, in tearDownClass
    cls.sc.stop()
  File "/__w/spark/spark/python/pyspark/context.py", line 654, in stop
    self._jsc.stop()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1321, in __call__
    answer = self.gateway_client.send_command(command)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused

Check failure on line 1 in python/pyspark/sql/tests/pandas/test_pandas_map.py


github-actions / Report test results

python/pyspark/sql/tests/pandas/test_pandas_map.py.setUpClass (pyspark.testing.sqlutils.ReusedSQLTestCase)

[Errno 111] Connection refused
Raw output
Traceback (most recent call last):
  File "/__w/spark/spark/python/pyspark/testing/sqlutils.py", line 260, in setUpClass
    super(ReusedSQLTestCase, cls).setUpClass()
  File "/__w/spark/spark/python/pyspark/testing/utils.py", line 176, in setUpClass
    cls.sc = SparkContext("local[4]", cls.__name__, conf=cls.conf())
  File "/__w/spark/spark/python/pyspark/testing/utils.py", line 172, in conf
    return SparkConf()
  File "/__w/spark/spark/python/pyspark/conf.py", line 133, in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1712, in __getattr__
    answer = self._gateway_client.send_command(
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1036, in send_command
    connection = self._get_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 284, in _get_connection
    connection = self._create_new_connection()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 291, in _create_new_connection
    connection.connect_to_java_server()
  File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/clientserver.py", line 438, in connect_to_java_server
    self.socket.connect((self.java_address, self.java_port))
ConnectionRefusedError: [Errno 111] Connection refused
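
Note on the setUpClass failure: the harness in pyspark/testing/utils.py builds a SparkConf and a local SparkContext, and even constructing SparkConf reaches into the JVM through the Py4J gateway (pyspark/conf.py line 133 above). A minimal sketch of that setup, assuming the app name from the failing class, is below; with the gateway process gone it fails on the first line.

from pyspark import SparkConf, SparkContext

# SparkConf() already calls into the JVM, so a dead gateway fails here,
# before the local context can even start.
conf = SparkConf()
sc = SparkContext("local[4]", "MapInPandasTests", conf=conf)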

Check failure on line 219 in SparkSessionE2ESuite


github-actions / Report test results

SparkSessionE2ESuite.interrupt tag

org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 30 times over 20.042734915 seconds. Last failure message: ListBuffer("575818d1-dd8a-461f-bae5-e1843277c7cd") had length 1 instead of expected length 2 Interrupted operations: ListBuffer(575818d1-dd8a-461f-bae5-e1843277c7cd)..
Raw output
sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 30 times over 20.042734915 seconds. Last failure message: ListBuffer("575818d1-dd8a-461f-bae5-e1843277c7cd") had length 1 instead of expected length 2 Interrupted operations: ListBuffer(575818d1-dd8a-461f-bae5-e1843277c7cd)..
	at org.scalatest.enablers.Retrying$$anon$4.tryTryAgain$2(Retrying.scala:219)
	at org.scalatest.enablers.Retrying$$anon$4.retry(Retrying.scala:226)
	at org.scalatest.concurrent.Eventually.eventually(Eventually.scala:313)
	at org.scalatest.concurrent.Eventually.eventually$(Eventually.scala:312)
	at org.scalatest.concurrent.Eventually$.eventually(Eventually.scala:457)
	at org.apache.spark.sql.SparkSessionE2ESuite.$anonfun$new$16(SparkSessionE2ESuite.scala:216)
	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)
	at org.scalatest.TestSuite.withFixture(TestSuite.scala:196)
	at org.scalatest.TestSuite.withFixture$(TestSuite.scala:195)
	at org.scalatest.funsuite.AnyFunSuite.withFixture(AnyFunSuite.scala:1564)
	at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)
	at org.scalatest.funsuite.AnyFunSuite.runTest(AnyFunSuite.scala:1564)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)
	at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)
	at scala.collection.immutable.List.foreach(List.scala:333)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)
	at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)
	at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)
	at org.scalatest.Suite.run(Suite.scala:1114)
	at org.scalatest.Suite.run$(Suite.scala:1096)
	at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)
	at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:535)
	at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)
	at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)
	at org.apache.spark.sql.SparkSessionE2ESuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkSessionE2ESuite.scala:37)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.sql.SparkSessionE2ESuite.run(SparkSessionE2ESuite.scala:37)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
	at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: ListBuffer("575818d1-dd8a-461f-bae5-e1843277c7cd") had length 1 instead of expected length 2 Interrupted operations: ListBuffer(575818d1-dd8a-461f-bae5-e1843277c7cd).
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
	at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
	at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
	at org.apache.spark.sql.SparkSessionE2ESuite.$anonfun$new$28(SparkSessionE2ESuite.scala:219)
	at org.scalatest.enablers.Retrying$$anon$4.makeAValiantAttempt$1(Retrying.scala:184)
	at org.scalatest.enablers.Retrying$$anon$4.tryTryAgain$2(Retrying.scala:196)
	... 48 more
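
Note on the "interrupt tag" failure: the Scala suite starts two operations under the same tag and waits (via eventually) for interruptTag to report two interrupted operation ids, but this run only ever saw one. A rough Python-side analogue of that tag-and-interrupt flow is sketched below, using PySpark's Spark Connect session API (addTag / interruptTag); the endpoint and the slow query are assumptions, not the suite's Scala code.

import threading
import time
from pyspark.sql import SparkSession

# The Connect endpoint is an assumption; the suite talks to a local test server.
spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()

def tagged_query():
    spark.addTag("interrupt_me")  # operations started in this thread carry the tag
    try:
        # A deliberately slow query so it is still running when the interrupt fires.
        spark.range(10_000_000_000).selectExpr("sum(id)").collect()
    except Exception:
        pass  # the interrupt surfaces as an error on the client side

threads = [threading.Thread(target=tagged_query) for _ in range(2)]
for t in threads:
    t.start()
time.sleep(2)  # give both operations time to reach the server

# interruptTag cancels every in-flight operation carrying the tag and returns
# their ids; the suite's assertion expects two of them.
print(spark.interruptTag("interrupt_me"))
for t in threads:
    t.join()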