
Release to Cloud #7

# What?
#
# Tag and release an arbitrary sha. Uploads to an internal archive for further processing.
#
# How?
#
# After checking out and testing the provided sha, the image is built and uploaded.
#
# When?
#
# Manual trigger.
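#
# Example manual trigger with the GitHub CLI (hypothetical invocation; only the
# required version_number is shown, the remaining inputs fall back to their defaults):
#
#   gh workflow run release-internal.yml -f version_number=1.0.0b1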
name: "Release internal patch"
on:
workflow_dispatch:
inputs:
version_number:
description: "The release version number (i.e. 1.0.0b1)"
type: string
required: true
sha:
description: "The ref (sha or branch name) to use"
type: string
default: "main"
required: true
package_test_command:
description: "Package test command"
type: string
default: "python -c \"import dbt.adapters.spark\""
required: true
defaults:
run:
shell: "bash"
env:
PYTHON_TARGET_VERSION: 3.11
jobs:
invoke-reusable-workflow:
name: "Build and Release Internally"
uses: "dbt-labs/dbt-release/.github/workflows/internal-archive-release.yml@mp/finish_internal_workflow"

Check failure on line 44 in .github/workflows/release-internal.yml

GitHub Actions / Release internal patch

Invalid workflow file

The workflow is not valid. In .github/workflows/release-internal.yml (Line: 44, Col: 11): Error from called workflow dbt-labs/dbt-release/.github/workflows/internal-archive-release.yml@mp/finish_internal_workflow (Line: 198, Col: 9): Unexpected symbol: '#'. Located at position 14 within expression:

  success() ||
  # Spark runs tests via its in-repo workflow
  inputs.dbms_name == 'spark' ||
  # Default behavior is to build artifacts as long as integration tests are invoked some way
  (needs.run-integration-tests.result == 'success' || needs.run-integration-tests-hatch.result == 'success')

    with:
      version_number: "${{ inputs.version_number }}"
      package_test_command: "${{ inputs.package_test_command }}"
      dbms_name: "spark"
      sha: "${{ inputs.sha }}"
    secrets: "inherit"
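
The failure is raised by the called reusable workflow, not by this file. GitHub Actions expressions have no comment syntax, so when the if: condition at line 198 of internal-archive-release.yml is written as a multi-line block scalar, the "#" lines are passed into the expression verbatim and the parser stops at the first "#" (position 14, right after "success() || "). A minimal sketch of the likely fix is to move the comments out of the expression; the job name and surrounding layout below are assumptions reconstructed from the error text above:

  # Hypothetical excerpt of internal-archive-release.yml; only the if: expression
  # is reconstructed from the error message.
  build-artifacts:
    # Spark runs tests via its in-repo workflow.
    # Default behavior is to build artifacts as long as integration tests are
    # invoked some way.
    if: >-
      success() ||
      inputs.dbms_name == 'spark' ||
      (needs.run-integration-tests.result == 'success' ||
       needs.run-integration-tests-hatch.result == 'success')

With the comments placed above the if: key they remain ordinary YAML comments, so the folded scalar collapses to a single valid expression and the caller in this repository needs no change.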