
[SPARK-49739][SQL] Throw Custom error condition for invalid options in SaveIntoDataSourceCommand class #48188

Closed

Conversation

ivanjevtic-db
Contributor

What changes were proposed in this pull request?

Until now, a NullPointerException was thrown when invalid options were provided to the run method of SaveIntoDataSourceCommand. The options map is propagated into java.util.Properties, which throws an NPE if a key or value is null. In this map a key cannot be null, but a value can, which is how the NPE can be produced.
I propose we return an appropriate error condition to the client instead. I've created a new error condition, INVALID_OPTIONS.NULL_VALUE: the options are checked, and the error is thrown if any entry in the options map has a null value.
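The proposed guard can be sketched as follows. This is a hedged, self-contained illustration: the exception class and helper names here are hypothetical, not the exact Spark error-framework API. It also reports which keys carry null values, which helps when option maps are built dynamically.

```scala
// Illustrative sketch (names are hypothetical, not Spark's actual classes):
// validate options before they reach java.util.Properties, and fail with a
// descriptive error instead of an NPE.
object NullOptionGuardDemo {
  final case class InvalidOptionsException(nullKeys: Seq[String])
    extends RuntimeException(
      s"[INVALID_OPTIONS.NULL_VALUE] Option value must not be null " +
        s"for key(s): ${nullKeys.mkString(", ")}")

  def validateOptions(options: Map[String, String]): Unit = {
    // Collect every key whose value is null so the user can see which
    // dynamically built option produced the problem.
    val nullKeys = options.collect { case (k, v) if v == null => k }.toSeq
    if (nullKeys.nonEmpty) throw InvalidOptionsException(nullKeys)
  }

  def main(args: Array[String]): Unit = {
    validateOptions(Map("url" -> "jdbc:h2:mem:test")) // passes silently

    val caught =
      try { validateOptions(Map[String, String]("password" -> null)); false }
      catch { case _: InvalidOptionsException => true }
    assert(caught, "expected InvalidOptionsException for a null value")
  }
}
```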

Why are the changes needed?

To send the client an appropriate error.

Does this PR introduce any user-facing change?

Yes: when invalid options are provided, INVALID_OPTIONS.NULL_VALUE is thrown instead of a NullPointerException.

How was this patch tested?

Wrote a unit test.

Was this patch authored or co-authored using generative AI tooling?

No.

github-actions bot added the SQL label — Sep 20, 2024
@@ -46,6 +46,10 @@ case class SaveIntoDataSourceCommand(
override def innerChildren: Seq[QueryPlan[_]] = Seq(query)

override def run(sparkSession: SparkSession): Seq[Row] = {
if (options.values.exists(_ == null)) {
Member

How about keys? Should we check them too?

Contributor Author

No, the keys cannot be null since Map[String, String] can't accept null keys.

},
"NULL_VALUE" : {
"message" : [
"Values of keys and values in `map()` must be string, but got null in value."
Member

Users can build values dynamically, so it may not be clear which one is null. Let's help them troubleshoot by including the key that has the null value.

@@ -2752,6 +2752,11 @@
"message" : [
"A type of keys and values in `map()` must be string, but got <mapType>."
]
},
"NULL_VALUE" : {
Member

How about reusing the existing error condition, NULL_DATA_SOURCE_OPTION?

Contributor Author

I actually missed this condition; it makes more sense.

@stevomitric
Contributor

stevomitric commented Sep 22, 2024

Master branch is guarded from the NullPointerException caused by SPARK-49739 here. We should consider backporting the fix from #46955.

@ivanjevtic-db
Contributor Author

Closing the PR because #46955 solves it.
