Finish internal build workflow (#999)
* Add workflow for spark

* Shape up workflow.

* Modify range of acceptable semvers to include a build tag.

* Fix action name by making it into a string

* Add tests to workflow

* Change Python version to match Cloud.

* Pare down Spark testing.

* Change the workflow's branch reference to main.

---------

Co-authored-by: Mila Page <versusfacit@users.noreply.github.com>
VersusFacit authored Mar 27, 2024
1 parent 748c7f6 commit 9fe1a06
Showing 2 changed files with 118 additions and 38 deletions.
153 changes: 116 additions & 37 deletions .github/workflows/release-internal.yml
@@ -1,4 +1,16 @@
-name: Release internal patch
+# What?
+#
+# Tag and release an arbitrary ref. Uploads to an internal archive for further processing.
+#
+# How?
+#
+# After checking out and testing the provided ref, the image is built and uploaded.
+#
+# When?
+#
+# Manual trigger.
+
+name: "Release internal patch"

 on:
   workflow_dispatch:
@@ -7,58 +19,125 @@ on:
         description: "The release version number (i.e. 1.0.0b1)"
         type: string
         required: true
-      sha:
-        description: "The sha to use (leave empty to use latest on main)"
-        type: string
-        required: false
       package_test_command:
         description: "Package test command"
         type: string
         default: "python -c \"import dbt.adapters.spark\""
         required: true
-      dbms_name:
-        description: "The name of the warehouse the adapter connects to."
+      ref:
+        description: "The ref (sha or branch name) to use"
         type: string
-        default: "spark"
+        default: "main"
         required: true
-  workflow_call:
-    inputs:
-      version_number:
-        description: "The release version number (i.e. 1.0.0b1)"
-        type: string
-        required: true
-      sha:
-        description: "The sha to use (leave empty to use latest on main)"
-        type: string
-        required: false
-      package_test_command:
-        description: "Package test command"
-        type: string
-        default: "python -c \"import dbt.adapters.spark\""
-        required: true
-      dbms_name:
-        description: "The name of the warehouse the adapter connects to."
-        type: string
-        default: "spark"
-        required: true

 defaults:
   run:
-    shell: bash
+    shell: "bash"

 env:
-  PYTHON_TARGET_VERSION: 3.11
+  PYTHON_TARGET_VERSION: 3.8

 jobs:
+  run-unit-tests:
+    name: "Unit tests"
+
+    runs-on: ubuntu-latest
+    timeout-minutes: 10
+
+    steps:
+      - name: "Check out the repository"
+        uses: actions/checkout@v3
+
+      - name: "Set up Python ${{ env.PYTHON_TARGET_VERSION }}"
+        uses: actions/setup-python@v4
+        with:
+          python-version: "${{ env.PYTHON_TARGET_VERSION }}"
+
+      - name: Install python dependencies
+        run: |
+          sudo apt-get update
+          sudo apt-get install libsasl2-dev
+          python -m pip install --user --upgrade pip
+          python -m pip --version
+          python -m pip install -r requirements.txt
+          python -m pip install -r dev-requirements.txt
+          python -m pip install -e .
+
+      - name: Run unit tests
+        run: python -m pytest --color=yes --csv unit_results.csv -v tests/unit
+
+  run-integration-tests:
+    name: "${{ matrix.test }}"
+    needs: [run-unit-tests]
+    runs-on: ubuntu-latest
+
+    strategy:
+      fail-fast: false
+      matrix:
+        test:
+          - "apache_spark"
+          - "spark_session"
+          - "databricks_sql_endpoint"
+          - "databricks_cluster"
+          - "databricks_http_cluster"
+
+    env:
+      DBT_INVOCATION_ENV: github-actions
+      DD_CIVISIBILITY_AGENTLESS_ENABLED: true
+      DD_API_KEY: ${{ secrets.DATADOG_API_KEY }}
+      DD_SITE: datadoghq.com
+      DD_ENV: ci
+      DD_SERVICE: ${{ github.event.repository.name }}
+      DBT_DATABRICKS_CLUSTER_NAME: ${{ secrets.DBT_DATABRICKS_CLUSTER_NAME }}
+      DBT_DATABRICKS_HOST_NAME: ${{ secrets.DBT_DATABRICKS_HOST_NAME }}
+      DBT_DATABRICKS_ENDPOINT: ${{ secrets.DBT_DATABRICKS_ENDPOINT }}
+      DBT_DATABRICKS_TOKEN: ${{ secrets.DBT_DATABRICKS_TOKEN }}
+      DBT_DATABRICKS_USER: ${{ secrets.DBT_DATABRICKS_USERNAME }}
+      DBT_TEST_USER_1: "buildbot+dbt_test_user_1@dbtlabs.com"
+      DBT_TEST_USER_2: "buildbot+dbt_test_user_2@dbtlabs.com"
+      DBT_TEST_USER_3: "buildbot+dbt_test_user_3@dbtlabs.com"
+
+    steps:
+      - name: Check out the repository
+        if: github.event_name != 'pull_request_target'
+        uses: actions/checkout@v3
+        with:
+          persist-credentials: false
+
+      # explicitly checkout the branch for the PR,
+      # this is necessary for the `pull_request` event
+      - name: Check out the repository (PR)
+        if: github.event_name == 'pull_request_target'
+        uses: actions/checkout@v3
+        with:
+          persist-credentials: false
+          ref: ${{ github.event.pull_request.head.ref }}
+
+      # the python version used here is not what is used in the tests themselves
+      - name: Set up Python for dagger
+        uses: actions/setup-python@v4
+        with:
+          python-version: "3.11"
+
+      - name: Install python dependencies
+        run: |
+          python -m pip install --user --upgrade pip
+          python -m pip --version
+          python -m pip install -r dagger/requirements.txt
+
+      - name: "Run tests for ${{ matrix.test }}"
+        run: python dagger/run_dbt_spark_tests.py --profile ${{ matrix.test }}
+
   invoke-reusable-workflow:
-    name: Build and Release Internally
+    name: "Build and Release Internally"
+    needs: [run-integration-tests]

-    uses: VersusFacit/dbt-release/.github/workflows/internal-archive-release.yml@main
+    uses: "dbt-labs/dbt-release/.github/workflows/internal-archive-release.yml@main"

     with:
-      version_number: ${{ inputs.version_number }}
-      package_test_command: ${{ inputs.package_test_command }}
-      dbms_name: ${{ inputs.dbms_name }}
-      sha: ${{ inputs.sha }}
+      version_number: "${{ inputs.version_number }}"
+      package_test_command: "${{ inputs.package_test_command }}"
+      dbms_name: "spark"
+      ref: "${{ inputs.ref }}"

-    secrets: inherit
+    secrets: "inherit"
3 changes: 2 additions & 1 deletion setup.py
@@ -31,7 +31,8 @@ def _get_plugin_version_dict():
     _version_path = os.path.join(this_directory, "dbt", "adapters", "spark", "__version__.py")
     _semver = r"""(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"""
     _pre = r"""((?P<prekind>a|b|rc)(?P<pre>\d+))?"""
-    _version_pattern = rf"""version\s*=\s*["']{_semver}{_pre}["']"""
+    _build = r"""(\+build[0-9]+)?"""
+    _version_pattern = rf"""version\s*=\s*["']{_semver}{_pre}{_build}["']"""
     with open(_version_path) as f:
         match = re.search(_version_pattern, f.read().strip())
         if match is None:
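The setup.py change is what the "modify range of acceptable semvers" commit message refers to: the version regex gains an optional build-tag group, so a version string carrying an internal build tag still parses. A quick sketch of the pattern's behavior, reusing the same regex pieces as setup.py; the version strings below are made-up examples:

import re

# Pattern pieces copied from setup.py; the version lines are invented examples.
_semver = r"""(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"""
_pre = r"""((?P<prekind>a|b|rc)(?P<pre>\d+))?"""
_build = r"""(\+build[0-9]+)?"""
_version_pattern = rf"""version\s*=\s*["']{_semver}{_pre}{_build}["']"""

for line in ['version = "1.8.0"', 'version = "1.8.0b1"', 'version = "1.8.0b1+build100"']:
    match = re.search(_version_pattern, line)
    print(line, "->", "match" if match else "no match")  # all three match with the new pattern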
