What is the default number of cores and amount of memory allocated to an application in Spark?
October 2, 2023

This is going to be a short post.
Number of executors in YARN deployments
spark.executor.instances controls the number of executors in YARN deployments. By default, the number of executors is 2.
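As a sketch, you can override this default either on the spark-submit command line or through spark.executor.instances; the application JAR name below is a placeholder, not from the original post:

```shell
# Request 4 executors on YARN instead of the default 2.
# --num-executors is the spark-submit shorthand for spark.executor.instances.
spark-submit \
  --master yarn \
  --num-executors 4 \
  my-app.jar  # placeholder application JAR
```

Setting spark.executor.instances directly (e.g. via --conf spark.executor.instances=4 or spark-defaults.conf) has the same effect.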
Number of executors in Standalone deployments
Spark standalone mode requires each application to run an executor on every node in the cluster, whereas with YARN, you choose the number of executors to use.
You can, however, control the number of cores used by each executor with spark.executor.cores and spark.cores.max.
The number of cores allocated will be spark.executor.cores, or spark.cores.max if spark.executor.cores is greater than spark.cores.max.
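The capping rule above can be sketched as a small function; effective_executor_cores is my own illustrative name, not a Spark API:

```python
def effective_executor_cores(executor_cores: int, cores_max: int) -> int:
    """Sketch of the rule described above (not Spark's actual scheduler code):
    an executor gets spark.executor.cores, capped at spark.cores.max when
    spark.executor.cores exceeds it."""
    return cores_max if executor_cores > cores_max else executor_cores

print(effective_executor_cores(4, 8))   # spark.executor.cores applies -> 4
print(effective_executor_cores(16, 8))  # capped at spark.cores.max -> 8
```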
Check out this post if you are interested in learning about the default number of cores and amount of memory allocated to an application in Spark.