This repository has been archived by the owner on Feb 3, 2021. It is now read-only.

Docs: update #263

Merged
merged 13 commits into from
Dec 12, 2017

Conversation

jiata
Contributor

@jiata jiata commented Dec 10, 2017

  • streamline readme
  • remove old refs
  • misc fixes

@jiata jiata changed the title Docs/update Docs: update Dec 10, 2017
Most users will want to work interactively with their Spark clusters. With the `aztk spark cluster ssh` command, you can SSH into the cluster's master node. This command also helps you port-forward your Spark Web UI and Spark Jobs UI to your local machine:
```bash
# SSH into the cluster's master node (prompts for the cluster user's credentials)
aztk spark cluster ssh --id <my_cluster_id>

# Or specify the user explicitly
aztk spark cluster ssh --id my_cluster --user spark
```
By default, we port forward the Spark Web UI to *localhost:8080*, Spark Jobs UI to *localhost:4040*, and the Spark History Server to *localhost:18080*.
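The tunnels that `aztk spark cluster ssh` opens are ordinary SSH local port forwards. As a rough sketch of what the command does under the hood (the master node's address and SSH port below are placeholders, not values from these docs):

```bash
# Approximate equivalent of the default forwards, using plain ssh:
#   localhost:8080  -> Spark Web UI
#   localhost:4040  -> Spark Jobs UI
#   localhost:18080 -> Spark History Server
ssh -p <ssh-port> \
    -L 8080:localhost:8080 \
    -L 4040:localhost:4040 \
    -L 18080:localhost:18080 \
    spark@<master-node-ip>
```

While the tunnel is open, browsing to *localhost:8080* on your local machine reaches the Spark Web UI running on the master node.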
Contributor

Worth adding the additional ports we forward today: 8787 for RStudio Server and 8888 for Jupyter?

@jiata jiata merged commit 6091b1d into master Dec 12, 2017
@jiata jiata deleted the docs/update branch December 12, 2017 01:02
@jiata jiata removed the in progress label Dec 12, 2017
jafreck pushed a commit that referenced this pull request Jan 9, 2018
* remove redundant setting in non-master code section and use non-os drive to mount HDFS (#242)

* Feature: Azure Files (#241)

* initial take on installing azure files

* fix cluster.yaml parsing of files shares

* remove test code

* add docs for Azure Files

* Feature: Rename SDK (#231)

* initial refactor

* rename cli_fe to cli

* add docs for sdk client

* typo

* remove conflict

* fix zip node scripts bug, add sdk_example program

* start models docs

* add ClusterConfiguration docs, fix merge bug

* Application docs update

* added Application and SparkConfiguration docs

* whitespace

* rename cli.py and spark/cli

* add docstring for load_spark_client

* Bug: fix bad reference to FileShare (#245)

* Feature: Spark GPU (#206)

* conditionally install and use nvidia-docker

* status statements, and -y flag for install

* add example, remove unnecessary ppa

* rename custom script, remove print statement, update example

* add Dockerfile

* fix path in Dockerfile

* update Docker images to use service account

* updated docs, changed default docker repo for gpu skus

* make timing statements more verbose

* remove unnecessary script

* added gpu docs

* fix up docs and numba example

* Feature: update docker image doc (#251)

* update docker-image readme with new images

* update docs

* Update 60-gpu.md (#253)

* Update 60-gpu.md

make sure it is available in the region

* Update 60-gpu.md

* Feature: Sparklyr (#243)

* Added rstudio server script

* Added rstudio server port to aztk sdk

* Added R dockerfiles

* Added new line on dockerfiles

* Pointing dockerfiles to new aztk-base

* allow any user or application in the server to write to the history server log directory

* Retry asking for password when it doesn't match or is empty (#252)

* Retry asking for password when it doesn't match or is empty

* Limit to 3 retries and let user know of add-user command on failure

* Throw error on failure

* Bug: fix wrong path for global secrets (#265)

* fix wrong path for global secrets

* load spark_conf files correctly

* docker-image docs fix

* docker-image docs fix

* move load_aztk_spark_config function to config.py

* Feature: Default Spark filesystem master HA (#271)

* add default filesystem master ha

* move settings to spark-defaults.conf

* whitespace

* Docs: update (#263)

* Update README.md

streamline and update main readme.md

* Update README.md

* Update README.md

* Update 13-configuration.md

* Update 12-docker-image.md

* Update 12-docker-image.md

* Update README.md

* Create README.md

* Update README.md

* Update 10-clusters.md

* Feature: add feedback for cluster create wait (#273)

* add feedback for cluster create wait

* whitespace

* alphasort imports

* Bug: fix loading local spark config (#282)

* Fix secrets.yaml format and add service principal for storage

* Feature: update to v0.5.0 (#283)

* Pass credentials through to node scripts

* Bug: History server parse file not exist (#288)

* jupyter azfiles bug + gpu sample (#291)

* gpu sample + jupyter mnt point

* rename jupyter gpu sample

* Check for both service principal and shared key auth

* More checks

* Bug: fix logic for worker custom scripts (#295)

* Bug: suppress warning on add-user (#302)

* Bug: fix alignment in get print cluster (#312)
2 participants