This repository has been archived by the owner on Feb 3, 2021. It is now read-only.

Feature: Readthedocs support #497

Merged
merged 43 commits into from
Apr 26, 2018

Conversation

@timotheeguerin (Member) commented Apr 18, 2018


```python
# define a custom script
custom_script = aztk.spark.models.CustomScript(
```

Review comment (Member): Since custom scripts are going away in favor of plugins, we should probably leave this out.

```python
status = client.get_application_status(cluster_config.cluster_id, app2.name)
```
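As a sketch of how the status call above might be used, here is a minimal polling loop. The terminal state names ("completed", "failed") and the callable wrapper are assumptions for illustration; only `client.get_application_status(cluster_config.cluster_id, app2.name)` appears in this PR.

```python
import time

def wait_for_application(get_status, poll_seconds=0.0, max_polls=100):
    # Poll a status callable until the application reaches a terminal
    # state. `get_status` stands in for a call like
    # client.get_application_status(cluster_id, app_name); the terminal
    # state names here are assumptions, not documented aztk values.
    terminal = {"completed", "failed"}
    for _ in range(max_polls):
        status = get_status()
        if status.lower() in terminal:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("application did not reach a terminal state")

# Fake status source standing in for the real client call:
states = iter(["running", "running", "completed"])
print(wait_for_application(lambda: next(states)))  # -> completed
```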

## stream logs of app, print to console as it runs

Review comment (Member): "stream" should be "Stream".
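The exact aztk call for streaming logs is not shown in this PR, so here is only a hedged sketch of the tail-follow loop such a section would describe: fetch the full log, print only the bytes not yet shown, and remember the offset. The `fetch_log` callable is a stand-in for the real SDK call.

```python
def print_new_output(fetch_log, printed=0):
    # Fetch the full log so far, print only the part past `printed`,
    # and return the new offset. `fetch_log` is a hypothetical stand-in
    # for an aztk SDK call returning the application's log text.
    log = fetch_log()
    new = log[printed:]
    if new:
        print(new, end="")
    return len(log)

# Simulated log that grows between polls:
chunks = ["line 1\n", "line 1\nline 2\n"]
offset = print_new_output(lambda: chunks[0])          # prints "line 1"
offset = print_new_output(lambda: chunks[1], offset)  # prints only "line 2"
```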


## Run application against cluster

Review comment (Member): I feel like "Run an application on the cluster" is better than "against".

docs/index.rst Outdated
@@ -0,0 +1,40 @@
Welcome to aztk's documentation!
================================
Azure Distributed Data Engineering Toolkit (AZTK) is a python CLI application for provisioning on-demand Spark on Docker clusters in Azure. It's a cheap and easy way to get up and running with a Spark cluster, and a great tool for Spark users who want to experiment and start testing at scale.
Review comment (Member): We should pick between aztk and AZTK and standardize in our docs.

@jafreck (Member) commented Apr 23, 2018

In the SDK docs, I think the only package we should have is the aztk.spark package with only aztk.spark.models and aztk.spark.client underneath it (not utils). Those should be the only public facing modules.

edit: and the aztk.error module
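A sketch of what such a restricted SDK reference page could look like, assuming the docs use Sphinx with the standard autodoc extension (the page title and layout are illustrative; only the module paths come from the comment above):

```rst
SDK Reference
=============

.. automodule:: aztk.spark.client
   :members:

.. automodule:: aztk.spark.models
   :members:

.. automodule:: aztk.error
   :members:
```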

@@ -13,25 +13,25 @@ Creating a Job starts with defining the necessary properties in your `.aztk/job.
Each Job has one or more applications given as a List in Job.yaml. Applications are defined using the following properties:
```yaml
applications:
- name:
```

Review comment (Contributor): I thought we decided to leave the whitespace in the YAML, since it is required.
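To illustrate the indentation in question, a hypothetical entry might look like the following; the values and any keys beyond `name` are illustrative assumptions, not taken from this PR:

```yaml
applications:
  - name: my-app                  # hypothetical values
    application: /path/to/app.py  # assumed key for the application file
```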

Reply (Member, author): Hmm, I guess it's the auto-formatting that removed those.

@@ -13,6 +13,14 @@ pylint==1.8.2
pytest==3.1.3
pytest-xdist==1.22.0
twine==1.9.1
docker==3.2.1
Review comment (Member): Why is this here?

@timotheeguerin timotheeguerin merged commit e361c3b into master Apr 26, 2018
@timotheeguerin timotheeguerin deleted the feature/readthedocs branch April 26, 2018 22:22
Successfully merging this pull request may close these issues: Setup readthedocs.