
GitHub Action to Sync S3 Bucket 🔄

This simple action uses the vanilla AWS CLI to sync a directory (either from your repository or generated during your workflow) with a remote S3 bucket.

Usage

workflow.yml Example

Place the workflow in a .yml file inside your .github/workflows folder. Refer to GitHub's documentation on workflow YAML syntax for details.

As of v0.3.0, all aws s3 sync flags are optional to allow for maximum customizability (that's a word, I promise) and must be provided by you via args:.

The following example includes optimal defaults for a public static website:

  • --acl public-read makes your files publicly readable (make sure your bucket settings are also set to public).
  • --follow-symlinks won't hurt and fixes some weird symbolic link problems that may come up.
  • Most importantly, --delete permanently deletes files in the S3 bucket that are not present in the latest version of your repository/build.
  • Optional tip: If you're uploading the root of your repository, adding --exclude '.git/*' prevents your .git folder from syncing, which would expose your source code history if your project is closed-source. (To exclude more than one pattern, you must have one --exclude flag per exclusion. The single quotes are also important! A sketch of a multi-exclusion args line follows the example below.)
name: Upload Website

on:
  push:
    branches:
    - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@master
    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --acl public-read --follow-symlinks --delete
      env:
        AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_REGION: 'us-west-1'   # optional: defaults to us-east-1
        SOURCE_DIR: 'public'      # optional: defaults to entire repository
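
If you need more than one exclusion, repeat the flag: one --exclude per pattern. A sketch of what the args line might look like when skipping both the .git folder and, say, a .gitignore file (the second pattern is purely illustrative):

    - uses: jakejarvis/s3-sync-action@master
      with:
        # one --exclude flag per pattern; keep the single quotes so the shell doesn't expand them
        args: --acl public-read --follow-symlinks --delete --exclude '.git/*' --exclude '.gitignore'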

Configuration

The following settings must be passed as environment variables as shown in the example. Sensitive information, especially AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, should be set as encrypted secrets — otherwise, they'll be public to anyone browsing your repository's source code and CI logs.

| Key | Value | Suggested Type | Required | Default |
| --- | --- | --- | --- | --- |
| AWS_ACCESS_KEY_ID | Your AWS Access Key. More info here. | secret env | Yes | N/A |
| AWS_SECRET_ACCESS_KEY | Your AWS Secret Access Key. More info here. | secret env | Yes | N/A |
| AWS_S3_BUCKET | The name of the bucket you're syncing to. For example, jarv.is or my-app-releases. | secret env | Yes | N/A |
| AWS_REGION | The region where you created your bucket. Set to us-east-1 by default. Full list of regions here. | env | No | us-east-1 |
| AWS_S3_ENDPOINT | The endpoint URL of the bucket you're syncing to. Can be used for VPC scenarios or for non-AWS services using the S3 API, like DigitalOcean Spaces. | env | No | Automatic (s3.amazonaws.com or AWS's region-specific equivalent) |
| SOURCE_DIR | The local directory (or file) you wish to sync/upload to S3. For example, public. Defaults to your entire repository. | env | No | ./ (root of cloned repository) |
| DEST_DIR | The directory inside the S3 bucket you wish to sync/upload to. For example, my_project/assets. Defaults to the root of the bucket. | env | No | / (root of bucket) |
| FIX_HTML | Copies each file with an .html extension to a copy without the extension, with the correct content type set in the S3 bucket. | env | No | N/A |
| AWS_CF_ID | CloudFront distribution ID. If provided, the CloudFront cache is invalidated after the sync so new content is served immediately. | env | No | N/A |
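
The optional endpoint, destination, HTML, and CloudFront settings are passed as extra environment variables on the same step. A minimal sketch extending the example above; the endpoint URL, DEST_DIR value, FIX_HTML value, and AWS_CF_ID secret name are illustrative assumptions, not defaults of the action:

    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --acl public-read --follow-symlinks --delete
      env:
        AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        AWS_S3_ENDPOINT: 'https://nyc3.digitaloceanspaces.com'  # assumed: a non-AWS S3-compatible endpoint
        SOURCE_DIR: 'public'
        DEST_DIR: 'my_project/assets'          # upload into this prefix inside the bucket
        FIX_HTML: 'true'                       # assumed truthy value; enables extensionless .html copies
        AWS_CF_ID: ${{ secrets.AWS_CF_ID }}    # invalidate this CloudFront distribution after syncing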

License

This project is distributed under the MIT license.
