Hi, could you please tell me how to run spark-shell with dynamic resources using the Stackable Operator for Apache Spark?
Hello @adwiza, thank you for your question, though it is a little difficult to answer without more context. We don't document running spark-shell directly: the SparkApplication resources that our operator deploys are essentially jobs/tasks in their own right, where pods are provisioned and then terminated. To run an interactive shell you would need some kind of standalone deployment. This is basically what we do in our Spark/JupyterHub-based demo, where a fixed number of executors are provisioned and run for as long as the kernel is active. You could derive something similar from that demo and then test what you want from within the notebook. You can find the notebook here. Or are you asking about dynamic resource allocation/scheduling, with something like Volcano?
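
If by "dynamic resources" you mean Spark's own dynamic executor allocation, the standard `spark.dynamicAllocation.*` options can in principle be passed through the `sparkConf` field of a SparkApplication manifest. A minimal sketch, not taken from our docs: the resource name, product version and application file below are placeholders, and the exact CRD fields may differ between operator releases.

```yaml
# Minimal sketch only; verify the CRD fields against your operator release.
apiVersion: spark.stackable.tech/v1alpha1
kind: SparkApplication
metadata:
  name: pyspark-pi-dynamic  # placeholder name
spec:
  mode: cluster
  sparkImage:
    productVersion: "3.5.1"  # placeholder version
  mainApplicationFile: local:///stackable/spark/examples/src/main/python/pi.py
  sparkConf:
    # Standard upstream Spark options. Kubernetes has no external shuffle
    # service, so shuffle tracking must be enabled for dynamic allocation.
    spark.dynamicAllocation.enabled: "true"
    spark.dynamicAllocation.shuffleTracking.enabled: "true"
    spark.dynamicAllocation.minExecutors: "1"
    spark.dynamicAllocation.maxExecutors: "4"
```

Note that this scales executors for a submitted job rather than for an interactive spark-shell session; for the interactive case you would still need a standalone deployment along the lines of the demo mentioned above.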
We don't support Spark Connect yet, but we have it on our radar if you want to track its progress: stackabletech/spark-k8s-operator#284