# create a user for the cluster
client.create_user(cluster.id, "sdk_example_user", "example_password")

# create some apps to run
Very minor: can we add one Scala/Java app as well?
We don't distribute the compiled Java/Scala programs in the repo, so if I add a Java/Scala example, the script either won't work out of the box or the Java/Scala app will need to be commented out.
Not sure which is preferable, but I would lean towards leaving it commented out so the script works by default.
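A commented-out entry along these lines would keep the script runnable by default. This is only an illustrative sketch: the field names below (`name`, `application`, `main_class`, `application_args`) and the jar path are hypothetical, not the actual aztk SDK API.

```python
# Hypothetical sketch of keeping a Scala/Java app commented out so the
# example script runs out of the box (no compiled jar ships in the repo).
# All field names and paths here are illustrative, not the real aztk API.

python_app = {
    "name": "pi-py",
    "application": "examples/src/main/python/pi.py",
    "application_args": ["100"],
}

# scala_app = {
#     "name": "pi-scala",
#     "application": "examples/jars/spark-examples.jar",  # jar not distributed in repo
#     "main_class": "org.apache.spark.examples.SparkPi",
#     "application_args": ["100"],
# }

apps = [python_app]  # append scala_app once the jar is built locally
```

A user who builds the jar locally only needs to uncomment the block and add it to `apps`.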
setup.py (outdated)

@@ -16,7 +16,7 @@
    ],
    entry_points=dict(
        console_scripts=[
-           "{0} = aztk.cli:main".format(constants.CLI_EXE)
+           "{0} = cli.cli:main".format(constants.CLI_EXE)
"cli.cli" looks a bit strange... shouldn't it just be cli:main ?
I have two minor pieces of feedback; otherwise this change looks good. I'm especially happy with the example file you added.
cli/spark/aztklib.py (outdated)

def load_spark_client():
    secrets_config = config.SecretsConfig()
Just add a small docstring here saying this method creates the client for you using the local user environment.
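The suggested docstring might read roughly as follows. To keep the sketch self-contained, `SecretsConfig` and `Client` below are stand-in stubs, not the real aztk implementations, and the `.aztk/secrets.yaml` path is mentioned only as the usual location of local secrets:

```python
# Self-contained sketch of the suggested docstring. SecretsConfig and Client
# are stubs standing in for the real aztk config and client classes.

class SecretsConfig:
    """Stub: in aztk this reads secrets from the local environment."""
    def load(self):
        return {"batch_account": "<from local secrets.yaml>"}

class Client:
    """Stub standing in for the aztk Spark client."""
    def __init__(self, secrets):
        self.secrets = secrets

def load_spark_client():
    """Create a Spark client configured from the local user environment.

    Reads the user's secrets configuration (e.g. .aztk/secrets.yaml) and
    returns a ready-to-use client, so callers never handle credentials
    directly.
    """
    secrets_config = SecretsConfig()
    return Client(secrets_config.load())
```

The first line of the docstring carries the reviewer's point: callers learn at a glance that the client is built from local configuration rather than from arguments.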
* remove redundant setting in non-master code section and use non-os drive to mount HDFS (#242)
* Feature: Azure Files (#241)
* initial take on installing azure files
* fix cluster.yaml parsing of files shares
* remove test code
* add docs for Azure Files
* Feature: Rename SDK (#231)
* initial refactor
* rename cli_fe to cli
* add docs for sdk client
* typo
* remove conflict
* fix zip node scripts bug, add sdk_example program
* start models docs
* add ClusterConfiguration docs, fix merge bug
* Application docs update
* added Application and SparkConfiguration docs
* whitespace
* rename cli.py and spark/cli
* add docstring for load_spark_client
* Bug: fix bad reference to FileShare (#245)
* Feature: Spark GPU (#206)
* conditionally install and use nvidia-docker
* status statements, and -y flag for install
* add example, remove unnecessary ppa
* rename custom script, remove print statement, update example
* add Dockerfile
* fix path in Dockerfile
* update Docker images to use service account
* updated docs, changed default docker repo for gpu skus
* make timing statements more verbose
* remove unnecessary script
* added gpu docs
* fix up docs and numba example
* Feature: update docker image doc (#251)
* update docker-image readme with new images
* update docs
* Update 60-gpu.md (#253)
* Update 60-gpu.md make sure is available in region
* Update 60-gpu.md
* Feature: Sparklyr (#243)
* Added rstudio server script
* Added rstudio server port to aztk sdk
* Added R dockerfiles
* Added new line on dockerfiles
* Pointing dockerfiles to new aztk-base
* allow any user or application in the server to write to the history server log directory
* Retry asking for password when it doesn't match or is empty (#252)
* Retry asking for password when it doesn't match or is empty
* Limit to 3 retries and let user know of add-user command on failure
* Throw error on failure
* Bug: fix wrong path for global secrets (#265)
* fix wrong path for global secrets
* load spark_conf files correctly
* docker-image docs fix
* docker-image docs fix
* move load_aztk_spark_config function to config.py
* Feature: Default Spark filesystem master HA (#271)
* add default filesystem master ha
* move settings to spark-defaults.conf
* whitespace
* Docs: update (#263)
* Update README.md streamline and update main readme.md
* Update README.md
* Update README.md
* Update 13-configuration.md
* Update 12-docker-image.md
* Update 12-docker-image.md
* Update README.md
* Create README.md
* Update README.md
* Update 10-clusters.md
* Feature: add feedback for cluster create wait (#273)
* add feedback for cluster create wait
* whitespace
* alphasort imports
* Bug: fix loading local spark config (#282)
* Fix secrets.yaml format and add service principal for storage
* Feature: update to v0.5.0 (#283)
* Pass credentials through to node scripts
* Bug: History server parse file not exist (#288)
* jupyter azfiles bug + gpu sample (#291)
* gpu sample + jupyter mnt point
* rename jupyter gpu sample
* Check for both service principal and shared key auth
* More checks
* Bug: fix logic for worker custom scripts (#295)
* Bug: suppress warning on add-user (#302)
* Bug: fix alignment in get print cluster (#312)
Fix #230
Fix #217
Fix #191