PySparkProcessor and spark.dynamicAllocation.maxExecutors conf #3593
Unanswered · andre-marcos-perez asked this question in Help
Hello, I noticed that `spark.dynamicAllocation.maxExecutors` is limited to the number of instances defined in the `PySparkProcessor` object:

So I guess the idea is to do the math to make sure Spark can scale the workers up to the instance count, given the instance type's hardware configuration and the defined worker configs. Is there a way to scale the number of instances beyond the `instance_count` param?
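
For reference, a minimal sketch of the kind of setup I mean (the role ARN, script name, framework version, and instance sizing below are placeholders, not the exact values from my job):

```python
from sagemaker.spark.processing import PySparkProcessor

# Placeholder role and instance settings; adjust to your environment.
spark_processor = PySparkProcessor(
    base_job_name="spark-preprocess",
    framework_version="3.1",
    role="arn:aws:iam::123456789012:role/SageMakerRole",
    instance_count=2,           # maxExecutors appears to be capped by this value
    instance_type="ml.m5.xlarge",
)

# Trying to raise maxExecutors beyond what instance_count can provide
configuration = [
    {
        "Classification": "spark-defaults",
        "Properties": {
            "spark.dynamicAllocation.enabled": "true",
            "spark.dynamicAllocation.maxExecutors": "10",
        },
    }
]

spark_processor.run(
    submit_app="preprocess.py",  # placeholder script
    configuration=configuration,
)
```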