[Feature] [Spark] Support Spark 4.0 (preview) #3940
Search before asking

Motivation
Support Spark 4.0 (preview1)

Solution
No response

Anything else?
No response

Are you willing to submit a PR?

Comments
@ulysses-you we can discuss this here.
Thank you @YannByron for the guide. I looked at Spark 4.0.0-preview; the main challenge is Scala 2.13. Other things, like JDK 17 and interface changes, are not big issues. As far as I can see, the Spark community paid a huge cost to support Scala 2.13 and drop Scala 2.12, and even now there are some performance regressions due to Scala 2.13, so I think it would affect Paimon significantly. Personally, I prefer to copy the code. cc @JingsongLi, what do you think?
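To illustrate the kind of source incompatibility behind the Scala 2.13 concern, here is a minimal, hypothetical sketch (not from the Paimon codebase): in 2.13 the default `scala.Seq` alias became `scala.collection.immutable.Seq`, so code that passes mutable collections where a `Seq` is expected stops compiling.

```scala
// Hypothetical sketch of a common Scala 2.12 -> 2.13 source break.
// In 2.13, the default Seq alias points at scala.collection.immutable.Seq,
// so mutable collections are no longer accepted where a Seq is expected.
import scala.collection.mutable

object SeqMigration {
  def consume(rows: Seq[String]): Int = rows.length

  def main(args: Array[String]): Unit = {
    val buffer = mutable.ArrayBuffer("a", "b")
    // consume(buffer)             // compiles on 2.12, fails on 2.13
    println(consume(buffer.toSeq)) // portable: explicit conversion works on both
  }
}
```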
You can use "com.thoughtworks.enableIf" to support multiple Scala versions.
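For reference, a minimal sketch of how `com.thoughtworks.enableIf` can gate a definition on the compiling Scala version (the method names below are illustrative, and the library additionally requires macro-annotation support, e.g. `-Ymacro-annotations` on 2.13):

```scala
// Illustrative sketch: the enableIf annotation macro keeps version-specific
// code in one source file. Its predicate is evaluated at compile time, so
// only the matching definition is kept for the Scala version in use.
import com.thoughtworks.enableIf

object VersionSpecific {
  // Only compiled when building with Scala 2.13.x
  @enableIf(scala.util.Properties.versionNumberString.startsWith("2.13"))
  def toImmutable[A](xs: Iterable[A]): Seq[A] = xs.toSeq

  // Only compiled when building with Scala 2.12.x
  @enableIf(scala.util.Properties.versionNumberString.startsWith("2.12"))
  def toImmutable[A](xs: Iterable[A]): List[A] = xs.toList
}
```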
Hi @ulysses-you @YannByron, I would like to ask whether …
Maybe we can allow … This approach doesn't allow compiling both Spark 3.x and Spark 4.x at the same time, and we would have to modify things like CI. But it avoids copying code and allows more reuse. @JingsongLi @ulysses-you WDYT~
The main issue with the reused module, to me, is that we need to compile the Spark module twice, once per Scala version. But I'm +1 for @YannByron's approach if you are fine with it.
@YannByron This approach is just like Flink with two Scala versions. I am OK with it~
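As a rough illustration of the Flink-style dual-Scala idea discussed above, here is a cross-build sketch expressed in sbt for brevity. Paimon's actual build uses Maven, so this is only a sketch of the concept; the module name, dependency versions, and coordinates are all assumptions.

```scala
// build.sbt -- hypothetical sketch of compiling one shared Spark module
// for two Scala versions, in the spirit of Flink's dual-Scala artifacts.
// Paimon itself uses Maven profiles; everything below is illustrative.

ThisBuild / organization := "org.example"

lazy val `paimon-spark-common` = (project in file("paimon-spark-common"))
  .settings(
    // Compile the shared module once per Scala version.
    crossScalaVersions := Seq("2.12.18", "2.13.14"),
    // Pick the Spark dependency that matches the Scala version in use:
    // Spark 4.0.0-preview1 only ships for Scala 2.13.
    libraryDependencies += {
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, 13)) => "org.apache.spark" %% "spark-sql" % "4.0.0-preview1" % Provided
        case _             => "org.apache.spark" %% "spark-sql" % "3.5.1" % Provided
      }
    }
  )
```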