This repository has been archived by the owner on Feb 3, 2021. It is now read-only.

Feature: enable dynamic allocation by default #386

Merged
2 changes: 1 addition & 1 deletion cli/config.py
@@ -308,7 +308,7 @@ def _merge_dict(self, config):
         self.spark_defaults_conf = self.__convert_to_path(spark_configuration.get('spark_defaults_conf'))
         self.spark_env_sh = self.__convert_to_path(spark_configuration.get('spark_env_sh'))
         self.core_site_xml = self.__convert_to_path(spark_configuration.get('core_site_xml'))
-        self.jars = [self.__convert_to_path(jar) for jar in spark_configuration.get('jars')]
+        self.jars = [self.__convert_to_path(jar) for jar in spark_configuration.get('jars') or []]

     def __convert_to_path(self, str_path):
         if str_path:
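
The `or []` guard matters because spark_configuration.get('jars') returns None when the user's config has no jars entry, and iterating over None raises a TypeError. A minimal sketch of the behavior; the empty dict below is illustrative and not part of the PR:

    # Illustrative only: a user config with no 'jars' key.
    spark_configuration = {}

    # Before the fix: .get('jars') returns None, so the comprehension fails.
    #   [jar for jar in spark_configuration.get('jars')]
    #   -> TypeError: 'NoneType' object is not iterable

    # After the fix: `None or []` evaluates to [], so the comprehension
    # simply yields an empty list.
    jars = [jar for jar in spark_configuration.get('jars') or []]
    print(jars)  # []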
3 changes: 3 additions & 0 deletions config/spark-defaults.conf
@@ -31,3 +31,6 @@ spark.jars /home/spark-current/jars/azure-storage-2.0.0.jar
 # Note: Default filesystem master HA
 spark.deploy.recoveryMode FILESYSTEM
 spark.deploy.recoveryDirectory /root/
+
+# enable dynamic allocation
+spark.dynamicAllocation.enabled true
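
For reference, the same default could also be set programmatically from PySpark instead of spark-defaults.conf. This is a sketch, not part of the PR: the executor bounds are illustrative, and spark.shuffle.service.enabled is included only because Spark's dynamic allocation normally also requires the external shuffle service (or equivalent shuffle tracking), which this diff does not touch.

    # Sketch: equivalent settings via SparkConf (values are illustrative).
    from pyspark import SparkConf, SparkContext

    conf = (
        SparkConf()
        .setAppName("dynamic-allocation-example")
        .set("spark.dynamicAllocation.enabled", "true")
        # Assumption: dynamic allocation usually also needs the external
        # shuffle service so executors can be removed without losing
        # shuffle data.
        .set("spark.shuffle.service.enabled", "true")
        # Illustrative bounds; the defaults are 0 and unbounded.
        .set("spark.dynamicAllocation.minExecutors", "1")
        .set("spark.dynamicAllocation.maxExecutors", "10")
    )

    sc = SparkContext(conf=conf)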