Uploading Files to AWS S3 with GitHub Actions
In this tutorial, I show you how to automatically upload files from your GitHub Actions workflow to your AWS S3 bucket. A common use case for this is deploying code to a releases folder for public consumption. Let's say, hypothetically, I have a desktop application. It gets compiled down to an executable, and there are three or four files required during installation that all need to be downloaded. You set up a workflow using GitHub Actions so that every merge to master deploys to AWS S3. Your users would always have access to your latest code; hence the name continuous delivery.
I am making a few assumptions:
* You have an AWS S3 bucket and an IAM user set up. You can set up both by following [this tutorial](https://keithweaver.ca/lesson/setting-up-a-s3-bucket-with-cross-region-replication).
* You have a GitHub account with access to GitHub Actions.
Github Repo Setup
Create your repository if you don't have one. You can head over to github.com/new.
You should be able to see the "Actions" tab on your repository.
Setting Up Your Workflow
Next, clone the repository to your local machine and open up your code base. If it does not already exist, you will need to create a workflow file at this path:
This will be your release workflow. The contents of the file will look like this:
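A sketch of the workflow, assuming GitHub Actions' standard `.github/workflows/` directory and a hypothetical file name of `release.yml`; the script path and secret names below match the ones used later in this tutorial:

```yaml
# .github/workflows/release.yml -- file name is an assumption
name: Release

# Trigger on every push to the master branch.
on:
  push:
    branches:
      - master

jobs:
  release:
    # Run on an Ubuntu container.
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Zip and upload to S3
        # Pass the credentials and bucket name in from the repository secrets.
        run: >
          bash ./scripts/release.sh
          "${{ secrets.AWS_Access_Key }}"
          "${{ secrets.AWS_Access_Secret }}"
          "${{ secrets.AWS_Bucket_Name }}"
```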
Let me explain the code above. First, the trigger is on pushes to the master branch. Next, we want to run on an Ubuntu container and run a bash script. We will create this bash script in a second, but first take a look at the arguments. We are passing a number of arguments to this script: the credentials and the bucket name, which we will set up in our secrets section. If our code is public, using secrets means we don't run the risk of exposing the credentials that access our bucket. Even if your repository is private, it's still good practice.
Next, create the release bash script at this location:
In this script, we grab the arguments, we zip our code and call a Python script to upload our file. It would be nice to just use the Amazon CLI for uploading but it's not installed by default. You could put together a curl call, but in my opinion a simple Python script is just as easy.
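A minimal sketch of such a script; the `scripts/release.sh` and `scripts/upload.py` paths and names are assumptions:

```shell
#!/bin/bash
# scripts/release.sh -- hypothetical path and file name.
# Usage: release.sh <access_key> <access_secret> <bucket_name>
set -eu

zip_source() {
  # Zip the current directory (the dot) into the given archive,
  # excluding the .git folder and the archive itself.
  local archive="$1"
  zip -r "$archive" . -x ".git/*" -x "$archive"
}

upload_archive() {
  # Hand the archive and credentials off to the Python upload script.
  local archive="$1" access_key="$2" access_secret="$3" bucket="$4"
  python3 scripts/upload.py "$archive" "$access_key" "$access_secret" "$bucket"
}

main() {
  local access_key="$1" access_secret="$2" bucket="$3"
  local archive="release.zip"
  zip_source "$archive"
  upload_archive "$archive" "$access_key" "$access_secret" "$bucket"
}

# Only run when invoked with the three expected arguments.
if [ "$#" -eq 3 ]; then
  main "$@"
fi
```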
The other aspect of the code above is the `zip` command. Do not forget the dot in the centre of the command, as it is crucial for zipping the current directory. The `-x` flag excludes whatever is listed after it. Why list `.git/*`? If you don't, the archive will still include the `.git` folder.
The last piece of code is the upload script. Create a file here:
This code is fairly simple and all values come from the arguments.
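A sketch of that uploader, assuming it lives at `scripts/upload.py` and that `boto3` is available on the runner (it is not part of the standard library, so you may need a `pip install boto3` step in the workflow):

```python
# scripts/upload.py -- hypothetical path and file name.
import sys


def parse_args(argv):
    """Expect: <file> <access_key> <access_secret> <bucket>."""
    if len(argv) != 5:
        raise SystemExit("usage: upload.py <file> <access_key> <access_secret> <bucket>")
    return argv[1], argv[2], argv[3], argv[4]


def upload(path, access_key, access_secret, bucket):
    """Upload a local file to the S3 bucket under the same name."""
    import boto3  # assumed dependency; install with `pip install boto3`

    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key,
        aws_secret_access_key=access_secret,
    )
    s3.upload_file(path, bucket, path)


if __name__ == "__main__" and len(sys.argv) == 5:
    upload(*parse_args(sys.argv))
```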
Just before you push the code up, you will need to add the secrets. This involves going to the Settings of the repository. On the left, you will see "Secrets". Add the following secrets:
* AWS_Bucket_Name
* AWS_Access_Key
* AWS_Access_Secret
If you have any questions about how to set up the S3 bucket or IAM user, you can follow my Setting up S3 bucket tutorial.
Push your code to the remote branch. The workflow will run automatically, and you should see the zip on S3.
If you download the zip, you can read the contents. The scripts folder was pushed up, but as you can see, it's empty. You can modify the exclude to ignore that folder as well.
Thanks for reading!