Internal: update vsts build (#635)
* explicit style configuration, add pylint to vsts

* print yapf version

* typo

* make vsts use python 3.6

* remove debug statements

* separate unit and integration tests in vsts

* remove dev trigger

* whitespace

* up the number of parallel processes

* fix diagnostic tool return value

* reenable dev builds

* remove dev builds
jafreck committed Aug 8, 2018
1 parent 7730c46 commit 6eda21e
Showing 3 changed files with 19 additions and 8 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -8,7 +8,7 @@ install:
 - pip install -e .
 
 script:
-- yapf -dpr aztk/ aztk_cli/
+- yapf --style .style.yapf -dpr aztk/ aztk_cli/
 - pylint -E aztk
 - pytest --ignore=tests/integration_tests
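
For context on the change above: `--style .style.yapf` points yapf at an explicit style file instead of relying on its default configuration search. The repository's `.style.yapf` is not part of this commit, so its contents are an assumption here; a minimal sketch of what such a file can look like:

[style]
# assumed settings for illustration only, not the project's actual configuration
based_on_style = pep8
column_limit = 120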
18 changes: 14 additions & 4 deletions .vsts-ci.yml
@@ -8,7 +8,7 @@ phases:
   steps:
   - task: UsePythonVersion@0
     inputs:
-      versionSpec: '>= 3.5'
+      versionSpec: '3.6'
       addToPath: true
       architecture: 'x64'

@@ -19,11 +19,21 @@ phases:
     displayName: install aztk
   - script: |
-      yapf -dpr aztk/ aztk_cli/
+      yapf --style .style.yapf -dpr aztk/ aztk_cli/
     condition: succeeded()
     displayName: yapf
   - script: |
-      pytest -n 102
+      pylint -E aztk
     condition: succeeded()
-    displayName: pytest
+    displayName: pylint
+  - script: |
+      pytest -n 20 --ignore=tests/integration_tests
+    condition: succeeded()
+    displayName: unit tests
+  - script: |
+      pytest -n 75
+    condition: succeeded()
+    displayName: integration tests
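
A note on the `pytest -n` invocations above: the `-n` flag comes from the pytest-xdist plugin, which distributes tests across worker processes, so that plugin must be installed in the job for these steps to run. A hedged sketch of an equivalent step that installs it explicitly (assuming it is not already pulled in by the project's requirements):

- script: |
    pip install pytest-xdist
    pytest -n 20 --ignore=tests/integration_tests
  condition: succeeded()
  displayName: unit tests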
7 changes: 4 additions & 3 deletions aztk/spark/client/cluster/helpers/diagnostics.py
@@ -13,17 +13,18 @@ def _run(spark_cluster_operations, cluster_id, output_directory=None):
     ssh_cmd = _build_diagnostic_ssh_command()
     run_output = spark_cluster_operations.run(cluster_id, ssh_cmd, host=True)
     remote_path = "/tmp/debug.zip"
+    result = None
     if output_directory:
         local_path = os.path.join(os.path.abspath(output_directory), "debug.zip")
-        output = spark_cluster_operations.download(cluster_id, remote_path, local_path, host=True)
+        result = spark_cluster_operations.download(cluster_id, remote_path, local_path, host=True)
 
         # write run output to debug/ directory
         with open(os.path.join(os.path.dirname(local_path), "debug-output.txt"), 'w', encoding="UTF-8") as f:
             [f.write(line + '\n') for node_output in run_output for line in node_output.output]
     else:
-        output = spark_cluster_operations.download(cluster_id, remote_path, host=True)
+        result = spark_cluster_operations.download(cluster_id, remote_path, host=True)
 
-    return output
+    return result
 
 
 def _build_diagnostic_ssh_command():
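
An aside on the code above, not part of the commit: the list comprehension inside the `with` block is used only for its side effect of writing lines, and a plain loop is the more idiomatic equivalent. In this sketch, `debug_output_path` is a hypothetical stand-in for the `os.path.join(...)` expression from the diff:

# equivalent plain-loop form of the write-out above (sketch; name assumed)
with open(debug_output_path, "w", encoding="UTF-8") as f:
    for node_output in run_output:
        for line in node_output.output:
            f.write(line + "\n")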
