
feat(spark): implement Spark datetime function last_day #16828


Open
Standing-Man wants to merge 4 commits into main

Conversation

@Standing-Man (Contributor) commented Jul 20, 2025

Which issue does this PR close?

Rationale for this change

What changes are included in this PR?

Implement Spark datetime function last_day

Are these changes tested?

I added tests for the last_day function.

Are there any user-facing changes?

Yes, this adds a new function.

@github-actions bot added the sqllogictest (SQL Logic Tests (.slt)) and spark labels on Jul 20, 2025
@Standing-Man (Contributor, Author) commented

Hi @alamb, I’ve added the last_day function. However, running cargo test --test sqllogictests -- spark produces some errors. I’m looking into it, but please let me know if you have any insights.

1. query failed: DataFusion error: Error during planning: Invalid function 'last_day'.
Did you mean 'list_cat'?

@alamb (Contributor) commented Jul 21, 2025

> Hi @alamb, I’ve added the last_day function. However, running cargo test --test sqllogictests -- spark produces some errors. I’m looking into it, but please let me know if you have any insights.
>
> 1. query failed: DataFusion error: Error during planning: Invalid function 'last_day'.
> Did you mean 'list_cat'?

I think the issue is that the function is named spark_last_day in the PR (rather than last_day)

    fn name(&self) -> &str {
        "spark_last_day"
    }
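
A minimal sketch of the corresponding fix, assuming the rest of the ScalarUDFImpl implementation stays as in the PR:

    fn name(&self) -> &str {
        // Expose the UDF under the Spark name so `SELECT last_day(...)` resolves
        // during planning instead of failing with "Invalid function 'last_day'".
        "last_day"
    }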

impl SparkLastDay {
    pub fn new() -> Self {
        Self {
            signature: Signature::user_defined(Volatility::Immutable),
A reviewer (Contributor) commented:

I think you can set the signature to accept exactly one Date32 argument.
The planning routine will then do all of the validation and coercion automatically (see the sketch below), so:

  • Inside invoke_with_args() we can assume the input already has a valid type, so those exec_err calls can be changed to internal_err and kept only as a sanity check
  • There is no need to implement coerce_types()
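
A minimal sketch of that suggestion, assuming the struct keeps the single signature field shown in the snippet above (the real SparkLastDay in the PR may have additional fields):

    use arrow::datatypes::DataType;
    use datafusion_expr::{Signature, Volatility};

    #[derive(Debug)]
    pub struct SparkLastDay {
        signature: Signature,
    }

    impl SparkLastDay {
        pub fn new() -> Self {
            Self {
                // Declare exactly one Date32 argument; the planner validates and
                // coerces arguments before invoke_with_args() runs, so type
                // mismatches inside the kernel can be reported with internal_err.
                signature: Signature::exact(vec![DataType::Date32], Volatility::Immutable),
            }
        }
    }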



query ?
SELECT NULL;
A reviewer (Contributor) commented:

Is it select last_day(null)?

Perhaps we can add more bad-case tests, like:

select last_day('foo');
select last_day(123);
select last_day();
select last_day('2016-02-07'::string, 'foo');
select last_day('2016-02-31'::string);

And ensure they return the expected errors.

Labels: spark, sqllogictest (SQL Logic Tests (.slt))
Projects: None yet
Successfully merging this pull request may close these issues:
[datafusion-spark] Implement Spark datetime function last_day

3 participants