We use Bitbucket for hosting our git repositories because their Academic Plan offers unlimited private repos for an unlimited number of contributors. Another great benefit is 500 minutes of free build time for Pipelines. One of our sites is hosted in an S3 bucket, and we wanted our repo files to sync up whenever a commit is made to the master branch. Pipelines seemed like the easiest method.

First, go to the repo you want to use and click “Pipelines” in the menu. If you haven’t set up a Pipeline yet, there will be a sample bitbucket-pipelines.yml file that you can change. Here’s the one we use:

image: cgswong/aws:aws

pipelines:
  branches:
    master:
      - step:
          script:
            - aws s3 --region "us-west-2" sync dist/ s3://your.bucket.name

For this to run correctly, you will need to set the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the repo’s Settings and update the script with your region, the directory you want to sync, and the name of your S3 bucket. Optionally, you can add the --delete flag to delete any files in the bucket that aren’t in your repo.
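For example, with the delete flag added, the sync line in the script would read (region and bucket name are placeholders you would swap for your own):

            - aws s3 --region "us-west-2" sync --delete dist/ s3://your.bucket.name

Be careful with this one: anything in the bucket that isn’t in the synced directory gets removed, so it’s best reserved for buckets that are deployed to exclusively from this Pipeline.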

This configuration file uses an image from Docker Hub that has the AWS command line tools installed. It then runs the “aws s3 sync” command, using your environment variables for authentication.
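If you want to sanity-check the command before pushing, you can run the same image locally with Docker. This is just a sketch under a couple of assumptions: that your credentials are exported in your local shell, and that the image expects the full aws command (depending on the image’s entrypoint you may not need to repeat it). The --dryrun flag makes aws s3 sync list what it would transfer without actually uploading anything:

    docker run --rm \
        -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY \
        -v "$PWD":/work -w /work \
        cgswong/aws:aws \
        aws s3 --region "us-west-2" sync dist/ s3://your.bucket.name --dryrun

Once the dry run shows the file list you expect, commit the bitbucket-pipelines.yml and let the Pipeline do the real sync.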

Now, every time you commit to master, your S3 site will be up to date!