Group-level session sharing: how to increase the number of parallel tasks #5155
Answered by pan3793 · shizhengchao asked this question in Q&A
If `spark.dynamicAllocation.maxExecutors` is set to 400, then only 400 tasks can run at the same time, and other newly submitted SQL statements will be blocked. How can this concurrency problem be solved?
pan3793 answered on Aug 11, 2023:
Typically, by increasing executor cores (`spark.executor.cores`). Each executor then runs several tasks concurrently, so the maximum parallelism becomes `maxExecutors × executor cores` rather than the bare executor count.
To let concurrently submitted SQL statements share those task slots fairly instead of queuing FIFO behind one another, see Spark's scheduling within an application: https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application