Add on-cluster workflow test to CI (#887)
**Pull Request Checklist**
- [x] Fixes #652
- [x] Tests added
- [x] Documentation/examples added
- [x] [Good commit messages](https://cbea.ms/git-commit/) and/or PR title

**Description of PR**
Introduce local Argo cluster workflow tests to CI, and `make` targets to
set up a local cluster for testing (this worked very well on GitHub
Codespaces, where I could even get a web UI to the cluster).

Added a single test to confirm the workflow runs via `create` and
reaches the `Succeeded` `status.phase`. We can think about creating a
more flexible/extensible framework for adding `on_cluster` tests:
* I'd like to use a subset of our examples (see the parametrized sketch
after the new test file below)
* But we want to assert on more than the `status.phase`, e.g. output
values, running order, skipped nodes, etc., which would require bespoke
tests for each
* It would also be good to have a wrapper for fetching these values,
e.g. getting a node requires iterating through the `status.nodes`
dictionary, which is keyed by hashes, so you have to check the
`displayName` of each value (a sketch of such a wrapper follows this
list).
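
As a concrete illustration of the wrapper idea, here is a minimal sketch.
The helper and its usage are hypothetical and not part of this PR; it assumes
Hera's generated models expose Argo's `displayName` field as `display_name`:

```python
from typing import Optional

from hera.workflows.models import NodeStatus, WorkflowStatus


def get_node_by_display_name(status: WorkflowStatus, display_name: str) -> Optional[NodeStatus]:
    """Hypothetical helper: find a node's status by its display name.

    `status.nodes` is keyed by generated node IDs/hashes, so we iterate the
    values and match on the display name instead.
    """
    for node in (status.nodes or {}).values():
        if node.display_name == display_name:
            return node
    return None


# Example of the richer assertions described above (hypothetical usage):
# node = get_node_by_display_name(model_workflow.status, "echo")
# assert node is not None and node.phase == "Succeeded"
```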

---------

Signed-off-by: Elliot Gunton <egunton@bloomberg.net>
elliotgunton committed Dec 12, 2023
1 parent 19fa389 commit a8edc84
Showing 5 changed files with 110 additions and 2 deletions.
36 changes: 36 additions & 0 deletions .github/workflows/cicd.yaml
@@ -91,6 +91,42 @@ jobs:
        with:
          file: ./coverage.xml

  workflow-tests:
    name: run workflow tests
    timeout-minutes: 10
    strategy:
      fail-fast: false

    runs-on: ubuntu-latest

    steps:
      - name: checkout
        uses: actions/checkout@v4

      - name: Install poetry
        run: pipx install poetry

      - name: setup python 3.8
        uses: actions/setup-python@v5
        with:
          python-version: 3.8
          cache: "poetry"

      - name: Install dependencies
        run: poetry install

      - name: setup k3d cluster
        run: make install-k3d

      - name: setup and run argo
        run: make run-argo

      - name: run workflow tests
        run: make test-on-cluster

      - name: stop argo cluster
        run: make stop-argo

concurrency:
  group: ${{ github.workflow }}-${{ github.ref || github.run_id }}
  cancel-in-progress: true
41 changes: 40 additions & 1 deletion Makefile
@@ -44,7 +44,7 @@ lint: ## Run a `lint` process on Hera and report problems

.PHONY: test
test: ## Run tests for Hera
	@poetry run python -m pytest --cov-report=term-missing
	@poetry run python -m pytest --cov-report=term-missing -m "not on_cluster"

.PHONY: workflows-models
workflows-models: ## Generate the Workflows models portion of Argo Workflows
@@ -126,3 +126,42 @@ regenerate-test-data: install-3.8
	find examples -name "*.yaml" -type f -delete
	HERA_REGENERATE=1 make test examples
	@poetry run python -m pytest -k test_for_missing_examples --runxfail

.PHONY: install-k3d
install-k3d: ## Install k3d client
	curl -s https://raw.githubusercontent.com/k3d-io/k3d/main/install.sh | bash

.PHONY: install-argo
install-argo: ## Install argo client
	# Download the binary
	curl -sLO https://github.com/argoproj/argo-workflows/releases/download/v$(ARGO_WORKFLOWS_VERSION)/argo-linux-amd64.gz

	# Unzip
	gunzip argo-linux-amd64.gz

	# Make binary executable
	chmod +x argo-linux-amd64

	# Move binary to path
	sudo mv ./argo-linux-amd64 /usr/local/bin/argo

	# Test installation
	argo version

.PHONY: run-argo
run-argo: ## Start the argo server
	k3d cluster list | grep test-cluster || k3d cluster create test-cluster
	k3d kubeconfig merge test-cluster --kubeconfig-switch-context
	kubectl get namespace argo || kubectl create namespace argo
	kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/v$(ARGO_WORKFLOWS_VERSION)/install.yaml
	kubectl patch deployment argo-server --namespace argo --type='json' -p='[{"op": "replace", "path": "/spec/template/spec/containers/0/args", "value": ["server", "--auth-mode=server"]}]'
	kubectl rollout status -n argo deployment/argo-server --timeout=120s --watch=true

.PHONY: stop-argo
stop-argo: ## Stop the argo server
	k3d cluster stop test-cluster

.PHONY: test-on-cluster
test-on-cluster: ## Run workflow tests (requires local argo cluster)
	@(kubectl -n argo port-forward deployment/argo-server 2746:2746 &)
	@poetry run python -m pytest tests/test_submission.py -m on_cluster
2 changes: 1 addition & 1 deletion examples/workflows-examples.md
3 changes: 3 additions & 0 deletions pyproject.toml
@@ -60,6 +60,9 @@ filterwarnings = [
    # Hide the hera.host_config deprecations
    'ignore:.*is deprecated in favor of `global_config.GlobalConfig',
]
markers = [
    "on_cluster: tests that run on an Argo cluster",
]

# Convert the following to config
[tool.mypy]
30 changes: 30 additions & 0 deletions tests/test_submission.py
@@ -0,0 +1,30 @@
import pytest

from hera.workflows import Steps, Workflow, WorkflowsService, script


@script()
def echo(message: str):
    print(message)


def get_workflow() -> Workflow:
    with Workflow(
        generate_name="hello-world-",
        entrypoint="steps",
        namespace="argo",
        workflows_service=WorkflowsService(
            host="https://localhost:2746",
            namespace="argo",
            verify_ssl=False,
        ),
    ) as w:
        with Steps(name="steps"):
            echo(arguments={"message": "Hello world!"})
    return w


@pytest.mark.on_cluster
def test_create_hello_world():
    model_workflow = get_workflow().create(wait=True)
    assert model_workflow.status and model_workflow.status.phase == "Succeeded"
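
Relatedly, the description above mentions reusing a subset of the repository's
examples for `on_cluster` tests. A minimal, hypothetical sketch of what a
parametrized version could look like; the example path, the `from_file` loader
usage, and the shared service setup are assumptions for illustration, not part
of this commit:

```python
from typing import List

import pytest

from hera.workflows import Workflow, WorkflowsService

# Hypothetical list of example workflow YAMLs to run on the local cluster.
EXAMPLE_FILES: List[str] = [
    "examples/workflows/steps/hello_world.yaml",
]


@pytest.mark.on_cluster
@pytest.mark.parametrize("yaml_file", EXAMPLE_FILES)
def test_example_workflow_succeeds(yaml_file: str):
    # Assumes the example uses `generate_name`, so repeated runs do not collide.
    workflow = Workflow.from_file(yaml_file)
    workflow.namespace = "argo"
    workflow.workflows_service = WorkflowsService(
        host="https://localhost:2746",
        namespace="argo",
        verify_ssl=False,
    )
    model_workflow = workflow.create(wait=True)
    assert model_workflow.status and model_workflow.status.phase == "Succeeded"
```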
