
[QUESTION] Spark versions compatibility for operator version v1beta2-1.6.x-3.5.0 #2100

Closed
1 task done
kraj007 opened this issue Jul 26, 2024 · 3 comments
Labels
question Further information is requested

Comments


kraj007 commented Jul 26, 2024

  • ✋ I have searched the open/closed issues and my issue is not listed.

Please describe your question here

As per the documentation, the base Spark version for operator version v1beta2-1.6.x-3.5.0 is 3.5.0.
What is meant by "base Spark version" here?
We are using Spark 3.4.1 (we can't upgrade to 3.5.0 as of now). We tested one example application with operator version v1beta2-1.6.x-3.5.0 and Spark 3.4.1, and so far there have been no issues. Does this mean operator version v1beta2-1.6.x-3.5.0 also supports Spark 3.4.1?
If this is not recommended, which operator version should we use?


@kraj007 kraj007 added the question Further information is requested label Jul 26, 2024
@ChenYi015
Contributor

@kraj007 The Spark operator uses the spark-submit script to submit Spark applications to Kubernetes, and that script comes from Spark 3.5.0 (for operator version v1beta2-1.6.x-3.5.0). In most cases this will be fine, since spark-submit changes only slightly between v3.4.x and v3.5.x.

@yuzhouliu9

yuzhouliu9 commented Oct 1, 2024

@ChenYi015
To confirm: the SparkApplication spec sparkVersion: 3.4.1 still defines the Spark version to use for a run, and it's just the spark-submit script used to submit the SparkApplication that is the newer one?

@jacobsalway
Member

@yuzhouliu9 Correct, the Spark version in your application image is what it'll run with. The sparkVersion field on the CR actually isn't used in any submission logic right now.
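To make the distinction concrete, here is a minimal SparkApplication sketch (the name, namespace, and image tag are illustrative, not taken from this thread): the Spark version baked into spec.image is what the driver and executors actually run, while spec.sparkVersion is informational per the maintainer's comment above.

```yaml
# Hypothetical SparkApplication manifest illustrating the discussion.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi            # illustrative name
  namespace: default
spec:
  type: Scala
  mode: cluster
  # The image determines the Spark runtime that actually executes the
  # driver and executors -- here Spark 3.4.1 (illustrative tag).
  image: spark:3.4.1
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.4.1.jar
  # Per the maintainer's comment, this field is not consulted by the
  # submission logic; submission itself runs the operator's bundled
  # spark-submit (from Spark 3.5.0 for operator v1beta2-1.6.x-3.5.0).
  sparkVersion: "3.4.1"
  driver:
    cores: 1
    memory: 512m
  executor:
    instances: 1
    cores: 1
    memory: 512m
```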
