[SPARK-50858][PYTHON] Add configuration to hide Python UDF stack trace #49535

Open

wengh wants to merge 3 commits into master from spark-50858-hide-udf-stack-trace
Conversation

@wengh commented Jan 17, 2025

What changes were proposed in this pull request?

Add a new configuration, spark.sql.execution.pyspark.udf.hideTraceback.enabled. If enabled, only the exception class and message are included when handling an exception from a Python UDF; the stack trace is omitted. The configuration is turned off by default.
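For illustration, a minimal sketch of how a user might enable it at runtime (the configuration key is the one added by this PR; the session setup is assumed):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hide Python UDF stack traces in error messages (turned off by default).
spark.conf.set("spark.sql.execution.pyspark.udf.hideTraceback.enabled", "true")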

Suggested review order:

  1. python/pyspark/util.py: logic changes (see the sketch after this list)
  2. python/pyspark/tests/test_util.py: unit tests
  3. other files: adding new configuration
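A rough sketch of the kind of logic change item 1 refers to, assuming the configuration value is propagated to the Python worker as a boolean; the helper name is hypothetical and the actual code in python/pyspark/util.py may differ:

import traceback

def format_exception_message(e: BaseException, hide_traceback: bool = False) -> str:
    # Hypothetical helper, not the actual util.py function.
    if hide_traceback:
        # Only the exception class and message, e.g.
        # "pyspark.errors.exceptions.base.PySparkRuntimeError: [XXX] ..."
        return "".join(traceback.format_exception_only(type(e), e)).strip()
    # Full traceback, matching the previous behavior.
    return "".join(traceback.format_exception(type(e), e, e.__traceback__)).strip()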

Why are the changes needed?

This allows library-provided UDFs to surface only the relevant error message, without an unnecessary stack trace.

Does this PR introduce any user-facing change?

If the configuration is turned off (the default), there is no user-facing change.
Otherwise, the stack trace is not included in the error message when an exception from a Python UDF is handled.

Example that illustrates the difference:
from pyspark.errors.exceptions.base import PySparkRuntimeError
from pyspark.sql.types import IntegerType, StructField, StructType
from pyspark.sql.udtf import AnalyzeArgument, AnalyzeResult
from pyspark.sql.functions import udtf


@udtf()
class PythonUDTF:
    @staticmethod
    def analyze(x: AnalyzeArgument) -> AnalyzeResult:
        raise PySparkRuntimeError("[XXX] My PySpark runtime error.")

    def eval(self, x: int):
        yield (x,)


# Assumes a PySpark shell or another environment where `spark` is an active SparkSession.
spark.udtf.register("my_udtf", PythonUDTF)
spark.sql("select * from my_udtf(1)").show()

With the configuration turned off, the last line gives:

...
pyspark.errors.exceptions.captured.AnalysisException: [TABLE_VALUED_FUNCTION_FAILED_TO_ANALYZE_IN_PYTHON] Failed to analyze the Python user defined table function: Traceback (most recent call last):
  File "<stdin>", line 7, in analyze
pyspark.errors.exceptions.base.PySparkRuntimeError: [XXX] My PySpark runtime error. SQLSTATE: 38000; line 1 pos 14

With the configuration turned on, the last line gives:

...
pyspark.errors.exceptions.captured.AnalysisException: [TABLE_VALUED_FUNCTION_FAILED_TO_ANALYZE_IN_PYTHON] Failed to analyze the Python user defined table function: pyspark.errors.exceptions.base.PySparkRuntimeError: [XXX] My PySpark runtime error. SQLSTATE: 38000; line 1 pos 14

How was this patch tested?

Added unit tests in python/pyspark/tests/test_util.py, covering two cases: one with the configuration turned on and one with it turned off.
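For reference, a sketch of the shape such tests might take; it reuses the hypothetical format_exception_message helper from the sketch above rather than the real code under test, so names and details will differ from the actual test_util.py:

import unittest

class HideTracebackTests(unittest.TestCase):
    def _format(self, hide_traceback: bool) -> str:
        try:
            raise RuntimeError("boom")
        except RuntimeError as e:
            # format_exception_message is the hypothetical helper sketched earlier.
            return format_exception_message(e, hide_traceback=hide_traceback)

    def test_traceback_hidden(self):
        msg = self._format(hide_traceback=True)
        self.assertIn("RuntimeError: boom", msg)
        self.assertNotIn("Traceback (most recent call last)", msg)

    def test_traceback_shown(self):
        msg = self._format(hide_traceback=False)
        self.assertIn("Traceback (most recent call last)", msg)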

Was this patch authored or co-authored using generative AI tooling?

No

@wengh force-pushed the spark-50858-hide-udf-stack-trace branch from c46ecf5 to 174fdea on January 17, 2025 00:17
@HyukjinKwon (Member) commented:

I think we should show the UDF exception. Otherwise users won't know what's going on and why their job fails.

@allisonwang-db (Contributor) commented Jan 17, 2025:

@HyukjinKwon I agree, and this PR is just to make it configurable (with the default value set to false - show stack trace by default). There are many user-friendly errors on the Python side, but they are often buried in a long Python-side stack trace. This change is intended to optionally hide these stack traces to improve the user experience.

@wengh marked this pull request as ready for review on January 17, 2025 18:40