Replies: 4 comments 1 reply
-
Hi @tgravescs, the batch engine is not interactive; it is just a normal Spark application, which depends on the resource jar you provide when you create a batch (POST /batches).
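As a rough sketch, a batch submission body for POST /batches might look like the following. Field names follow Kyuubi's batch REST API; the jar path, class name, and batch name are placeholder assumptions for illustration:

```python
import json

def build_batch_request(resource, class_name, name, conf=None, args=None):
    """Build the JSON body for a Spark batch submission (POST /batches)."""
    return {
        "batchType": "SPARK",     # the batch engine type
        "resource": resource,     # path to the application jar you provide
        "className": class_name,  # main class of the Spark application
        "name": name,             # a display name for the batch
        "conf": conf or {},       # per-batch Spark/Kyuubi configs
        "args": args or [],       # arguments passed to the application
    }

# Placeholder values -- substitute your own jar and main class.
payload = build_batch_request(
    resource="/path/to/spark-app.jar",
    class_name="org.example.MySparkApp",
    name="example-batch",
)
print(json.dumps(payload))
```

The body would then be POSTed to the server's /batches endpoint with `Content-Type: application/json`.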
-
Hi @tgravescs, do you want to submit a code script and run it on the same engine you connect to via JDBC? If so, you can try to execute
-
The batch submission REST API does not support running multiple Spark jobs in the same application yet. Another possible solution is Spark Connect, which may make this easier, but it requires the Spark community to complete that feature first.
-
Thanks for the responses. Yeah, I'm looking at Spark Connect as well, but what I'm thinking about is probably more like the Spark Job Server use case, where you could have a single Spark application deployed (set up with, say, the fair scheduler) and allow multiple batch jobs to run on it. By batch jobs I mean users submitting Python files as you would with spark-submit. Ideally, it would be nice if you could have JDBC and batch jobs together.
-
Hello,
I was looking at the new batch submission REST API and was wondering whether that API uses the same share level as JDBC connections. That is, if I configure USER-level sharing and have one JDBC connection open, then use the REST API to submit a batch job, will it use the same engine on the backend? In this case I'm using Spark.
Ideally I want the ability to re-use the same engine for multiple batch jobs.
Thanks,
Tom
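For context on the share level mentioned above, Kyuubi's engine share level is set via server configuration; a minimal sketch (the file location and surrounding setup are assumptions, but `kyuubi.engine.share.level` is the relevant key):

```properties
# kyuubi-defaults.conf
# USER share level: all connections from the same user reuse one engine.
kyuubi.engine.share.level=USER
```

The question in this thread is whether a batch submitted via REST would also attach to that per-user engine, which the replies above indicate is not the case today.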