
Uploading Files to AWS S3 with Github Actions

In this tutorial, I show you how to automatically upload files from your Github Actions workflow to your AWS S3 bucket. A common use case for this is deploying code to a releases folder for public consumption. Let's say, hypothetically, I have a desktop application. It gets compiled down to an executable. There are three or four files required during installation, and all of them need to be downloaded. You set up a workflow with Github Actions so that on every merge to master it deploys to AWS S3. Your users would always have access to your latest code. Hence the name continuous delivery.

I am making a few assumptions:

* You have an AWS S3 bucket and IAM user set up. You can set up both by following [this tutorial](https://keithweaver.ca/lesson/setting-up-a-s3-bucket-with-cross-region-replication).
* You have a Github account with access to Github Actions.

Github Repo Setup

Create your repository if you don't have one. You can head over to github.com/new.

Screenshot of Github Workflow Running

You should be able to see the "Actions" tab on your repository.

Screenshot of Github Workflow Running

Setting Up Your Workflow

Next, clone the repository to your local machine and open up your code base. You will need to create a file at this path if it does not already exist:

.github/workflows/release.yml

This will be your release workflow. The contents of the file will look like this:

```yaml
name: ReleaseCI
on:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Release to S3
        run: bash scripts/pipeline/release.sh ${{secrets.AWS_Bucket_Name}} "folder1/myapp.zip" ${{secrets.AWS_Access_Key}} ${{secrets.AWS_Access_Secret}} "myapp.zip"
```

Let me explain the code above. First, the workflow triggers on pushes to the master branch. Next, we run on an Ubuntu runner and execute a bash script. We will create this bash script in a second, but first take a look at the arguments. We pass a number of arguments to this script. We will set up our credentials and bucket name in the repository's secrets section. Because they live in secrets, even if our code is public we don't run the risk of exposing the credentials that access our bucket. Even if your repository is private, it's still good practice.
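For clarity, here is a rough sketch of what that run step boils down to once the secrets are resolved. The values below are placeholders, not real credentials, and the script itself is created in the next step.

```bash
# Roughly what the runner executes once the secrets are substituted in.
# Placeholder values only; the real secret values never appear in the logs.
bash scripts/pipeline/release.sh "my-release-bucket" "folder1/myapp.zip" "AKIAEXAMPLEKEY" "example-secret" "myapp.zip"
```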

Next, create the release bash script at this location:

scripts/pipeline/release.sh

In this script, we grab the arguments, zip our code, and call a Python script to upload the file. It would be nice to just use the AWS CLI for the upload, but it's not installed by default. You could put together a curl call, but in my opinion a simple Python script is just as easy.

```bash
bucket_name=$1
aws_key=$2
aws_access_key=$3
aws_access_secret=$4
local_path=$5

# Remove any existing versions of a ZIP
rm -rf $local_path

# Create a zip of the current directory.
zip -r $local_path . -x .git/ .git/*** .github/workflows/release.yml scripts/pipeline/release.sh scripts/pipeline/upload_file_to_s3.py .DS_Store

# Install required dependencies for Python script.
pip3 install boto3

# Run upload script
python3 scripts/pipeline/upload_file_to_s3.py $bucket_name $aws_key $aws_access_key $aws_access_secret $local_path
```

The other aspect of the code above is the zip command. Do not forget the dot in the middle of the command, as it is crucial for zipping the current directory. The -x flag excludes whatever is listed after it. Why list both .git/ and .git/***? If you don't list both, the archive will still include an empty .git folder.
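If you want to sanity-check the archive locally before relying on the workflow, listing its contents is a quick way to confirm the exclusions worked. This assumes you have unzip available on your machine and have already built myapp.zip with the command above.

```bash
# Optional local check: list what actually went into the archive.
# The excluded paths (.git, the workflow file, the pipeline scripts) should not appear.
unzip -l myapp.zip
```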

The last piece of code is the upload script. Create a file here:

scripts/pipeline/upload_file_to_s3.py

This code is fairly simple and all values come from the arguments.

```python
import boto3
import sys

def main():
    if len(sys.argv) != 6:
        print('Error: Required 5 arguments.')
        # Checks for 6 because the script path is in position 0. So len is 6
        # for 5 arguments.
        sys.exit(1)
    bucket_name = sys.argv[1]
    aws_key = sys.argv[2]
    aws_access_key = sys.argv[3]
    aws_access_secret = sys.argv[4]
    local_path = sys.argv[5]
    session = boto3.Session(
        aws_access_key_id=aws_access_key,
        aws_secret_access_key=aws_access_secret,
    )
    client = session.client('s3')
    client.upload_file(
        Filename=local_path,
        Bucket=bucket_name,
        Key=aws_key
    )
    print('Done uploading')

main()
```
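If you want to try the upload script on its own before wiring it into the pipeline, a hypothetical local invocation could look like this. The bucket, key, and credentials are placeholders; swap in your own values.

```bash
# Hypothetical local test of the upload script, outside of Github Actions.
# All five values are placeholders for your own bucket, key, and credentials.
pip3 install boto3
python3 scripts/pipeline/upload_file_to_s3.py \
  "my-release-bucket" \
  "folder1/myapp.zip" \
  "AKIAEXAMPLEKEY" \
  "example-secret" \
  "myapp.zip"
```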

Screenshot of Github Workflow Running

Just before you push the code up, you will need to add the secrets. This involves going to the Settings of the repository. On the left, you will see "Secrets". Add the secrets:

* AWS_Bucket_Name
* AWS_Access_Key
* AWS_Access_Secret
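If you prefer the terminal over the web UI, the Github CLI can set the same secrets. This is a sketch assuming you have gh installed and authenticated for the repository; the values are placeholders.

```bash
# Sketch: add the three repository secrets with the Github CLI (gh).
# Assumes gh is installed and authenticated; the values are placeholders.
gh secret set AWS_Bucket_Name --body "my-release-bucket"
gh secret set AWS_Access_Key --body "AKIAEXAMPLEKEY"
gh secret set AWS_Access_Secret --body "example-secret"
```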

If you have any questions on how to set up the S3 bucket or IAM role, you can follow my Setting up S3 bucket tutorial.

Screenshot of Github Workflow Running

Push your code to the remote branch. The workflow will run automatically, and you should see the zip in S3.
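If you are working from a fresh clone, the push is the usual sequence:

```bash
# Commit the workflow, scripts, and your application code, then push.
# The push to master is what triggers the workflow.
git add .
git commit -m "Add S3 release workflow"
git push origin master
```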

Screenshot of Github Workflow Running

If you download the zip, you can inspect the contents. The scripts folder was pushed up, but as you can see it's empty. You can modify the exclude list to ignore that folder as well.
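For example, one way to adjust the zip line in scripts/pipeline/release.sh would be to exclude the scripts and .github folders outright, mirroring the folder/contents pattern already used for .git. This is an untested sketch; keep the original line if you want those folder entries in the archive.

```bash
# Possible tweak: exclude the scripts and .github folders entirely,
# using the same pattern the script already applies to .git.
zip -r $local_path . -x .git/ .git/*** .github/ .github/*** scripts/ scripts/*** .DS_Store
```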

Screenshot of Github Workflow Running

Thanks for reading!