Refactor Artifacts at rest for safe/easy deployment #16187

Merged: 42 commits, Feb 23, 2021
Changes from 32 commits

Commits (42)
056a8ef
update build_packages to output to a deeper subfolder
scbedd Dec 9, 2020
7f5632c
Merge remote-tracking branch 'upstream/master' into feature/refactor-…
scbedd Jan 13, 2021
d999244
extend whl search to recursive in preparation for updated artifacts
scbedd Jan 13, 2021
90d8d27
progress before stopping for the night
scbedd Jan 13, 2021
3ab9a0c
extended tox_helper_tasks implementation of find_whl to be friendly t…
scbedd Jan 13, 2021
61ced2a
update find_whl in common_tasks to do the same logic that our previou…
scbedd Jan 15, 2021
05ecef7
use the correct variable name
scbedd Jan 15, 2021
10d098a
where is my brain? have i left in in the garbage can? fixing another …
scbedd Jan 15, 2021
3f14f77
...
scbedd Jan 15, 2021
1ef5486
paths -> whls
scbedd Jan 15, 2021
575dae5
Merge remote-tracking branch 'upstream/master' into feature/refactor-…
scbedd Jan 25, 2021
b36cdb0
Merge remote-tracking branch 'upstream/master' into feature/refactor-…
scbedd Jan 29, 2021
75a7e6d
find_sdist is updated the same as find_whl (tox_helper_tasks). update…
scbedd Jan 29, 2021
67ae5d9
update build_packages to write the appropriate package folder, clarif…
scbedd Jan 29, 2021
94c8318
artifacts building in proper location now
scbedd Jan 29, 2021
d129327
docs output to the correct folder now
scbedd Jan 29, 2021
9324e35
repair build-artifacts so that the checks can actually queue
scbedd Jan 29, 2021
fef4bae
remove Artifacts from argument list and at calltime from build-artif…
scbedd Jan 29, 2021
038e37f
download the correct artifact
scbedd Jan 29, 2021
6ed6a31
update the release to take advantage of the new structure
scbedd Jan 29, 2021
8695385
resolving the issues with the dev_requirement replacements not being …
scbedd Jan 29, 2021
33dc627
repair analyze step. need to download the correct artifact
scbedd Jan 29, 2021
d1dfa54
remove duplicate msrest requirement
scbedd Jan 30, 2021
561a8a1
Revert "remove duplicate msrest requirement"
scbedd Jan 30, 2021
0dfc1a4
replaced globbing with os.walk
scbedd Feb 2, 2021
8fbc9f5
break all the existing PRs so that I can debug this when I get home
scbedd Feb 2, 2021
a392170
merge upstream
scbedd Feb 2, 2021
302bc89
resolving find_whl in tox_helper_tasks
scbedd Feb 2, 2021
9b7f914
undo ci.yml change for template. make it exactly the same as what we'…
scbedd Feb 2, 2021
f55e5f5
Merge remote-tracking branch 'upstream/master' into feature/refactor-…
scbedd Feb 3, 2021
4e59ded
Merge branch 'master' into feature/refactor-artifacts
scbedd Feb 17, 2021
51d3520
need include recursion, given that artifacts are no longer _right there_
scbedd Feb 18, 2021
293a9c3
cleanup typo
scbedd Feb 22, 2021
2f40f0b
regression needs to download the right artifact
scbedd Feb 22, 2021
e1a270d
Merge remote-tracking branch 'upstream/master' into feature/refactor-…
scbedd Feb 22, 2021
0bda80a
resolve some core regression failures. does the min/latest still inst…
scbedd Feb 22, 2021
56b0c0e
move installation of the package we need to test _after_ the installa…
scbedd Feb 22, 2021
fa87c04
ensure the proper version of the package is installed, even if it's c…
scbedd Feb 22, 2021
5f1ea2a
add debug commit to figure out what is going on with the create pull …
scbedd Feb 23, 2021
2e20b70
restore create-pull-request to master
scbedd Feb 23, 2021
93c47b0
no reason to have that 'docs' in the documentation artifact when it's…
scbedd Feb 23, 2021
9f72866
remove pdb reference
scbedd Feb 23, 2021
2 changes: 1 addition & 1 deletion build_package.py
@@ -10,8 +10,8 @@
import os
import glob
import sys
from subprocess import check_call

from subprocess import check_call

DEFAULT_DEST_FOLDER = "./dist"

2 changes: 1 addition & 1 deletion eng/pipelines/templates/jobs/archetype-sdk-client.yml
@@ -210,7 +210,7 @@ jobs:
BeforeTestSteps:
- task: DownloadPipelineArtifact@0
inputs:
artifactName: 'artifacts'
artifactName: 'packages'
targetPath: $(Build.ArtifactStagingDirectory)

- template: ../steps/set-dev-build.yml
86 changes: 42 additions & 44 deletions eng/pipelines/templates/stages/archetype-python-release.yml
@@ -31,6 +31,7 @@ stages:
deploy:
steps:
- checkout: self

- ${{if eq(parameters.TestPipeline, 'true')}}:
- task: PowerShell@2
displayName: Prep template pipeline for release
@@ -40,24 +41,26 @@
workingDirectory: $(Build.SourcesDirectory)
filePath: eng/scripts/SetTestPipelineVersion.ps1
arguments: '-BuildID $(Build.BuildId)'

- ${{if ne(artifact.skipVerifyChangeLog, 'true')}}:
- template: /eng/common/pipelines/templates/steps/verify-changelog.yml
parameters:
PackageName: ${{artifact.name}}
ServiceName: ${{parameters.ServiceDirectory}}
ForRelease: true
- template: /eng/pipelines/templates/steps/stage-filtered-artifacts.yml
parameters:
SourceFolder: ${{parameters.ArtifactName}}
TargetFolder: ${{artifact.safeName}}
PackageName: ${{artifact.name}}

- pwsh: |
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{artifact.safeName}}
$packageDirectory = "${{artifact.name}}".Replace("_", "-")
echo "##vso[task.setvariable variable=Package.Name]$packageDirectory"

- pwsh: |
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)
workingDirectory: $(Pipeline.Workspace)
displayName: Output Visible Artifacts

- template: /eng/common/pipelines/templates/steps/create-tags-and-git-release.yml
parameters:
ArtifactLocation: $(Pipeline.Workspace)/${{artifact.safeName}}
ArtifactLocation: $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)
PackageRepository: PyPI
ReleaseSha: $(Build.SourceVersion)
RepoId: Azure/azure-sdk-for-python
@@ -83,19 +86,17 @@
artifact: ${{parameters.ArtifactName}}
timeoutInMinutes: 5

- template: /eng/pipelines/templates/steps/stage-filtered-artifacts.yml
parameters:
SourceFolder: ${{parameters.ArtifactName}}
TargetFolder: ${{artifact.safeName}}
PackageName: ${{artifact.name}}

- task: UsePythonVersion@0

- script: |
set -e
pip install twine readme-renderer[md]
displayName: Install Twine

- pwsh: |
$packageDirectory = "${{artifact.name}}".Replace("_", "-")
echo "##vso[task.setvariable variable=Package.Name]$packageDirectory"

- task: TwineAuthenticate@1
displayName: 'Authenticate to registry: pypi.org'
inputs:
@@ -108,17 +109,17 @@

- script: |
set -e
twine upload --repository 'pypi' --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*.whl
twine upload --repository 'pypi' --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*.whl
echo "Uploaded whl to pypi"
twine upload --repository 'pypi' --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*.zip
twine upload --repository 'pypi' --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*.zip
echo "Uploaded zip to pypi"
displayName: 'Publish package to registry: pypi.org'

- script: |
set -e
twine upload --repository ${{parameters.DevFeedName}} --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*.whl
twine upload --repository ${{parameters.DevFeedName}} --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*.whl
echo "Uploaded whl to devops feed"
twine upload --repository ${{parameters.DevFeedName}} --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*.zip
twine upload --repository ${{parameters.DevFeedName}} --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*.zip
echo "Uploaded sdist to devops feed"
displayName: 'Publish package to feed: ${{parameters.DevFeedName}}'

@@ -138,23 +139,23 @@
deploy:
steps:
- checkout: self
- template: /eng/pipelines/templates/steps/stage-filtered-artifacts.yml
parameters:
SourceFolder: ${{parameters.DocArtifact}}
TargetFolder: ${{artifact.safeName}}
PackageName: ${{artifact.name}}
AdditionalRegex: '.zip'

- pwsh: |
$packageDirectory = "${{artifact.name}}".Replace("_", "-")
echo "##vso[task.setvariable variable=Package.Name]$packageDirectory"

- pwsh: |
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{artifact.safeName}}
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{parameters.DocArtifact}}/$(Package.Name)
workingDirectory: $(Pipeline.Workspace)
displayName: Output Visible Artifacts

- template: /eng/common/pipelines/templates/steps/publish-blobs.yml
parameters:
FolderForUpload: '$(Pipeline.Workspace)/${{artifact.safeName}}'
FolderForUpload: '$(Pipeline.Workspace)/${{parameters.DocArtifact}}/$(Package.Name)'
BlobSASKey: '$(azure-sdk-docs-prod-sas)'
BlobName: '$(azure-sdk-docs-prod-blob-name)'
TargetLanguage: 'python'
ArtifactLocation: '$(Pipeline.Workspace)/${{parameters.ArtifactName}}'
ArtifactLocation: '$(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)'
# we override the regular script path because we have cloned the build tools repo as a separate artifact.
ScriptPath: 'eng/common/scripts/copy-docs-to-blobstorage.ps1'

@@ -177,22 +178,24 @@
deploy:
steps:
- checkout: self
- template: /eng/pipelines/templates/steps/stage-filtered-artifacts.yml
parameters:
SourceFolder: ${{parameters.ArtifactName}}
TargetFolder: ${{artifact.safeName}}
PackageName: ${{artifact.name}}

- pwsh: |
$packageDirectory = "${{artifact.name}}".Replace("_", "-")
echo "##vso[task.setvariable variable=Package.Name]$packageDirectory"

- pwsh: |
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{artifact.safeName}}
Get-ChildItem -Recurse $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)
workingDirectory: $(Pipeline.Workspace)
displayName: Output Visible Artifacts

- template: /eng/common/pipelines/templates/steps/get-pr-owners.yml
parameters:
TargetVariable: "OwningGHUser"
ServiceDirectory: ${{parameters.ServiceDirectory}}

- template: /eng/common/pipelines/templates/steps/docs-metadata-release.yml
parameters:
ArtifactLocation: $(Pipeline.Workspace)/${{artifact.safeName}}
ArtifactLocation: $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)
PackageRepository: PyPI
ReleaseSha: $(Build.SourceVersion)
RepoId: Azure/azure-sdk-for-python
@@ -280,22 +283,17 @@
- ${{ each artifact in parameters.Artifacts }}:
- ${{if ne(artifact.skipPublishDevFeed, 'true')}}:
- pwsh: |
Get-ChildItem $(Pipeline.Workspace)/${{parameters.ArtifactName}}
New-Item -Type Directory -Name ${{artifact.safeName}} -Path $(Pipeline.Workspace)
$underscorePrefix = "${{artifact.name}}"
$dashPrefix = "${{artifact.name}}".Replace("_", "-")
Copy-Item $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$dashPrefix-[0-9]*.[0-9]*.[0-9]*a[0-9]* $(Pipeline.Workspace)/${{artifact.safeName}}
Copy-Item $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$underscorePrefix-[0-9]*.[0-9]*.[0-9]*a[0-9]* $(Pipeline.Workspace)/${{artifact.safeName}}
Get-ChildItem $(Pipeline.Workspace)/${{artifact.safeName}}

$fileCount = (Get-ChildItem $(Pipeline.Workspace)/${{artifact.safeName}} | Measure-Object).Count
$packageDirectory = "${{artifact.name}}".Replace("_", "-")
echo "##vso[task.setvariable variable=Package.Name]$packageDirectory"
- pwsh: |
$fileCount = (Get-ChildItem $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$packageDirectory | Measure-Object).Count
if ($fileCount -eq 0) {
Write-Host "No alpha packages for ${{artifact.safeName}} to publish."
exit 0
}

twine upload --repository $(DevFeedName) --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*a*.whl
twine upload --repository $(DevFeedName) --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*a*.whl
Contributor: Not really what this change is about, but I wonder if this wildcard is too permissive.

Member Author: What makes you say that?

echo "Uploaded whl to devops feed $(DevFeedName)"
twine upload --repository $(DevFeedName) --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{artifact.safeName}}/*a*.zip
twine upload --repository $(DevFeedName) --config-file $(PYPIRC_PATH) $(Pipeline.Workspace)/${{parameters.ArtifactName}}/$(Package.Name)/*a*.zip
echo "Uploaded sdist to devops feed $(DevFeedName)"
displayName: 'Publish ${{artifact.name}} alpha package'
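
The reviewer's question about the "*a*" wildcard can be made concrete with fnmatch-style matching, the same matching this PR uses in its Python helpers. The following is a minimal sketch with hypothetical wheel names, not part of the pipeline: a pattern like "*a*.whl" matches any wheel whose filename contains an "a" anywhere (including the "-any" platform tag), while a pattern anchored on an alpha version segment matches only alpha builds.

# Sketch only: illustrates how permissive "*a*.whl" is compared with an
# alpha-anchored pattern. The wheel names below are hypothetical examples.
import fnmatch

wheels = [
    "azure_core-1.12.0-py3-none-any.whl",              # GA build
    "azure_core-1.12.0a20210223001-py3-none-any.whl",  # alpha (dev) build
]

for name in wheels:
    broad = fnmatch.fnmatch(name, "*a*.whl")                   # True for both names
    anchored = fnmatch.fnmatch(name, "*-[0-9]*a[0-9]*-*.whl")  # True only for the alpha build
    print(name, broad, anchored)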
2 changes: 1 addition & 1 deletion eng/pipelines/templates/steps/analyze.yml
@@ -90,7 +90,7 @@ steps:
- task: DownloadPipelineArtifact@0
condition: and(succeededOrFailed(), ne(variables['Skip.ApiStubGen'],'true'))
inputs:
artifactName: 'artifacts'
artifactName: 'packages'
targetPath: $(Build.ArtifactStagingDirectory)

- template: ../steps/run_apistub.yml
34 changes: 18 additions & 16 deletions eng/pipelines/templates/steps/build-artifacts.yml
@@ -1,9 +1,19 @@
parameters:
BeforePublishSteps: []
TestPipeline: false
BuildTargetingString: 'azure-*'
ServiceDirectory: ''
BuildDocs: true
- name: BeforePublishSteps
type: object
default: []
- name: TestPipeline
type: boolean
default: false
- name: BuildTargetingString
type: string
default: 'azure-*'
- name: ServiceDirectory
type: string
default: ''
- name: BuildDocs
type: boolean
default: true

steps:
- ${{if eq(parameters.TestPipeline, 'true')}}:
@@ -57,8 +67,8 @@ steps:
arguments: '-d "$(Build.ArtifactStagingDirectory)" "${{ parameters.BuildTargetingString }}" --service=${{parameters.ServiceDirectory}} --devbuild="$(SetDevVersion)"'

- script: |
twine check $(Build.ArtifactStagingDirectory)/*.whl
twine check $(Build.ArtifactStagingDirectory)/*.zip
twine check $(Build.ArtifactStagingDirectory)/**/*.whl
twine check $(Build.ArtifactStagingDirectory)/**/*.zip
displayName: 'Verify Readme'

- task: PythonScript@0
@@ -73,17 +83,9 @@

- ${{ parameters.BeforePublishSteps }}

- task: PublishPipelineArtifact@0
inputs:
artifactName: 'artifacts'
targetPath: $(Build.ArtifactStagingDirectory)

# Duplicating the task above to introduce a packages artifact for consistency
# with the other pipelines. Also using the newer YAML shortcut. Once we get
# past release successfully with unified pipelines we'll look at getting rid
# of the duplicated "artifacts" artifact.
- publish: $(Build.ArtifactStagingDirectory)
artifact: packages
condition: succeededOrFailed()

- task: PublishBuildArtifacts@1
condition: and(succeededOrFailed(), ${{parameters.BuildDocs}})
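
Because build_packages.py now writes each package into its own subfolder of the staging directory, the twine check globs are widened from "*.whl" / "*.zip" to "**/*.whl" / "**/*.zip". Below is a minimal Python sketch of the new one-level-deep layout, using hypothetical paths; in the shell step itself a plain "*/*.whl" would also cover the single extra directory level, so the wider pattern is mainly for safety.

# Sketch only: builds a throwaway copy of the nested staging layout and shows
# that the wheel now sits one directory below the staging root. Paths are hypothetical.
import glob
import os
import tempfile

staging = tempfile.mkdtemp()
pkg_dir = os.path.join(staging, "azure-core")   # assumption: one folder per package
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "azure_core-1.12.0-py3-none-any.whl"), "w").close()

print(glob.glob(os.path.join(staging, "*.whl")))                         # old flat pattern: []
print(glob.glob(os.path.join(staging, "**", "*.whl"), recursive=True))   # new pattern: finds the wheel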
2 changes: 1 addition & 1 deletion eng/tox/run_sphinx_build.py
@@ -35,7 +35,7 @@ def move_output_and_zip(target_dir, package_dir, package_name):
if not os.path.exists(ci_doc_dir):
os.mkdir(ci_doc_dir)

individual_zip_location = os.path.join(ci_doc_dir, package_name)
individual_zip_location = os.path.join(ci_doc_dir, package_name, package_name + "-docs")
shutil.make_archive(individual_zip_location, 'zip', target_dir)

def sphinx_build(target_dir, output_dir):
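
The run_sphinx_build.py change nests the generated docs archive under a per-package folder and appends a "-docs" suffix to the zip name. Here is a minimal sketch, with hypothetical paths, of where shutil.make_archive now writes the file:

# Sketch only: shows the archive path produced by the new base name.
# ci_doc_dir, package_name and target_dir are hypothetical stand-ins for the
# values the real script computes.
import os
import shutil
import tempfile

ci_doc_dir = tempfile.mkdtemp()          # docs staging folder
package_name = "azure-core"
target_dir = tempfile.mkdtemp()          # built HTML to be zipped
open(os.path.join(target_dir, "index.html"), "w").close()

individual_zip_location = os.path.join(ci_doc_dir, package_name, package_name + "-docs")
os.makedirs(os.path.dirname(individual_zip_location), exist_ok=True)  # defensive: ensure the per-package folder exists
archive = shutil.make_archive(individual_zip_location, "zip", target_dir)
print(archive)   # <ci_doc_dir>/azure-core/azure-core-docs.zip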
22 changes: 17 additions & 5 deletions eng/tox/tox_helper_tasks.py
@@ -17,6 +17,8 @@
import io
import glob
import zipfile
import pdb
import fnmatch

logging.getLogger().setLevel(logging.INFO)

@@ -92,7 +94,13 @@ def find_sdist(dist_dir, pkg_name, pkg_version):
return

pkg_name_format = "{0}-{1}.zip".format(pkg_name, pkg_version)
packages = [os.path.basename(w) for w in glob.glob(os.path.join(dist_dir, pkg_name_format))]
packages = []
for root, dirnames, filenames in os.walk(dist_dir):
for filename in fnmatch.filter(filenames, pkg_name_format):
packages.append(os.path.join(root, filename))

packages = [os.path.relpath(w, dist_dir) for w in packages]

if not packages:
logging.error("No sdist is found in directory %s with package name format %s", dist_dir, pkg_name_format)
return
@@ -109,8 +117,15 @@ def find_whl(whl_dir, pkg_name, pkg_version):
logging.error("Package name cannot be empty to find whl")
return


pkg_name_format = "{0}-{1}-*.whl".format(pkg_name.replace("-", "_"), pkg_version)
whls = [os.path.basename(w) for w in glob.glob(os.path.join(whl_dir, pkg_name_format))]
whls = []
for root, dirnames, filenames in os.walk(whl_dir):
for filename in fnmatch.filter(filenames, pkg_name_format):
whls.append(os.path.join(root, filename))

whls = [os.path.relpath(w, whl_dir) for w in whls]

if not whls:
logging.error("No whl is found in directory %s with package name format %s", whl_dir, pkg_name_format)
logging.info("List of whls in directory: %s", glob.glob(os.path.join(whl_dir, "*.whl")))
@@ -136,6 +151,3 @@ def find_whl(whl_dir, pkg_name, pkg_version):
return whls[0]
else:
return None
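
The single-directory glob lookup in find_sdist and find_whl is replaced by an os.walk plus fnmatch search, so packages are found even inside the new per-package subfolders, and the matches are then rebased to paths relative to the search root. A compact sketch of that shared pattern, with hypothetical directory and file names:

# Sketch only: recursive search equivalent to the updated find_whl/find_sdist logic.
import fnmatch
import os

def find_packages(dist_dir, pkg_name_format):
    matches = []
    for root, dirnames, filenames in os.walk(dist_dir):        # descend into nested folders
        for filename in fnmatch.filter(filenames, pkg_name_format):
            matches.append(os.path.join(root, filename))
    # rebase to paths relative to the search root, as the updated helpers do
    return [os.path.relpath(m, dist_dir) for m in matches]

# e.g. find_packages("/staging", "azure-core-1.12.0.zip") could return
# ["azure-core/azure-core-1.12.0.zip"] under the new nested layout.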



4 changes: 2 additions & 2 deletions scripts/devops_tasks/build_packages.py
@@ -25,7 +25,7 @@
def build_packages(targeted_packages, distribution_directory, is_dev_build=False):
# run the build and distribution
for package_root in targeted_packages:
print(package_root)
service_hierarchy = os.path.join(os.path.basename(package_root))
if is_dev_build:
verify_update_package_requirement(package_root)
print("Generating Package Using Python {}".format(sys.version))
@@ -34,7 +34,7 @@ def build_packages(targeted_packages, distribution_directory, is_dev_build=False
sys.executable,
build_packing_script_location,
"--dest",
distribution_directory,
os.path.join(distribution_directory, service_hierarchy),
package_root,
],
root_dir,
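
With service_hierarchy derived from the basename of the package folder, each package's sdist and wheel now land in a subfolder of the distribution directory named after the package, instead of the flat staging root. A small sketch of how the new --dest argument is composed, with hypothetical paths:

# Sketch only: composition of the new --dest value. Paths are hypothetical.
import os

distribution_directory = "/staging"             # e.g. $(Build.ArtifactStagingDirectory)
package_root = "/repo/sdk/core/azure-core"      # one targeted package folder

service_hierarchy = os.path.basename(package_root)            # "azure-core"
dest = os.path.join(distribution_directory, service_hierarchy)
print(dest)   # /staging/azure-core  (one subfolder per package)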
16 changes: 11 additions & 5 deletions scripts/devops_tasks/common_tasks.py
@@ -19,7 +19,7 @@
import textwrap
import io
import re
import pdb
import fnmatch

# Assumes the presence of setuptools
from pkg_resources import parse_version, parse_requirements, Requirement, WorkingSet, working_set
@@ -356,17 +356,23 @@ def find_whl(package_name, version, whl_directory):
parsed_version = parse(version)

logging.info("Searching whl for package {0}-{1}".format(package_name, parsed_version.base_version))
whl_name = "{0}-{1}*.whl".format(package_name.replace("-", "_"), parsed_version.base_version)
paths = glob.glob(os.path.join(whl_directory, whl_name))
if not paths:
whl_name_format = "{0}-{1}-*.whl".format(package_name.replace("-", "_"), parsed_version.base_version)
whls = []
for root, dirnames, filenames in os.walk(whl_directory):
for filename in fnmatch.filter(filenames, whl_name_format):
whls.append(os.path.join(root, filename))

whls = [os.path.relpath(w, whl_directory) for w in whls]

if not whls:
logging.error(
"whl is not found in whl directory {0} for package {1}-{2}".format(
whl_directory, package_name, parsed_version.base_version
)
)
exit(1)

return paths[0]
return whls[0]

# This method installs package from a pre-built whl
def install_package_from_whl(
3 changes: 2 additions & 1 deletion scripts/devops_tasks/tox_harness.py
@@ -243,7 +243,7 @@ def build_whl_for_req(req, package_path):
logging.info("Building wheel for package {}".format(pkg_name))
run_check_call([sys.executable, "setup.py", "bdist_wheel", "-d", temp_dir], req_pkg_path)

whl_path = find_whl(pkg_name, version, temp_dir)
whl_path = os.path.join(temp_dir, find_whl(pkg_name, version, temp_dir))
logging.info("Wheel for package {0} is {1}".format(pkg_name, whl_path))
logging.info("Replacing dev requirement. Old requirement:{0}, New requirement:{1}".format(req, whl_path))
return whl_path
Expand All @@ -265,6 +265,7 @@ def replace_dev_reqs(file, pkg_root):

req_file_name = os.path.basename(file)
logging.info("Old {0}:{1}".format(req_file_name, adjusted_req_lines))

adjusted_req_lines = list(map(lambda x: build_whl_for_req(x, pkg_root), adjusted_req_lines))
logging.info("New {0}:{1}".format(req_file_name, adjusted_req_lines))

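
One consequence of the refactor worth calling out: the updated find_whl helpers return a path relative to the directory they searched, so callers such as build_whl_for_req in tox_harness.py now join that result back onto the search root. A minimal sketch, with hypothetical values, of why the join is needed:

# Sketch only: the relative-path contract of the refactored find_whl.
# Using the relative result directly would resolve against the current working
# directory rather than the directory that was searched.
import os

temp_dir = "/tmp/wheels"                                    # hypothetical search root
relative_whl = "azure_core-1.12.0-py3-none-any.whl"         # hypothetical find_whl result

wrong = os.path.abspath(relative_whl)          # resolves against os.getcwd()
right = os.path.join(temp_dir, relative_whl)   # what build_whl_for_req does now
print(wrong)
print(right)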