SPARK_WORKER_INSTANCES (http://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts) — would specifying this parameter be enough?
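For context, SPARK_WORKER_INSTANCES is one of the environment variables read from conf/spark-env.sh on each node of a standalone cluster. A minimal sketch (all values below are illustrative, not from the original thread):

```bash
# conf/spark-env.sh (sourced by the standalone launch scripts)
# Run two Worker daemons on this node instead of one.
SPARK_WORKER_INSTANCES=2
# Resources each Worker daemon is allowed to hand out.
SPARK_WORKER_CORES=8
SPARK_WORKER_MEMORY=16g
```

Note that this only multiplies the number of Worker daemons per node; it does not by itself change how many executors a single Worker will launch.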
SPARK_WORKER_INSTANCES configures how many Workers can run on a single node. Normally one Worker runs a single CoarseGrainedExecutorBackend process. My question is how to configure one Worker so that it can run multiple CoarseGrainedExecutorBackend processes.
To have one Worker run multiple CoarseGrainedExecutorBackend processes, you need to set the Worker's resources (cores and memory) to N times the resources requested per Executor.
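A minimal sketch of that setup, assuming a Worker that advertises 16 cores / 32g and an application that requests 4-core / 8g executors (all numbers are illustrative):

```bash
# conf/spark-env.sh — resources one Worker advertises to the master
SPARK_WORKER_CORES=16
SPARK_WORKER_MEMORY=32g

# Submit with per-executor resources set to a fraction of the Worker's,
# so the master can place several executors on the same Worker.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --executor-cores 4 \
  --executor-memory 8g \
  --total-executor-cores 16 \
  your-app.jar
```

With these numbers the Worker has headroom for up to four 4-core / 8g executors, so it can start multiple CoarseGrainedExecutorBackend processes for the same application. If --executor-cores is left unset in standalone mode, an executor takes all of a Worker's available cores, and only one executor per Worker is launched for a given application.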