Setting Up This Blog
I previously set up this blog with the awesome Hugo open source static-site generator. Last time I used GitLab Pages. I also recently set up a static site using GitHub and Netlify. However, I want to have another go at resurrecting this site and use it to keep study notes for exams and personal projects. This time, I’ve decided to go all in on AWS.
Set up CodeCommit
First I created a new repository on CodeCommit. I then followed the instructions found here. This meant:
- Creating a new IAM User and attaching the `AWSCodeCommitFullAccess` policy to grant the right permissions
- Running `ssh-keygen` to generate a public/private RSA key pair and uploading the SSH public key to the User
- Cloning the new repository to work locally
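The steps above can be sketched as a couple of commands. The key path and SSH key ID below are placeholders, not values from my account, and the config snippet is written to a standalone file here rather than `~/.ssh/config`:

```shell
# Generate an RSA key pair with no passphrase; the file name is illustrative.
ssh-keygen -t rsa -b 2048 -N "" -f ./codecommit_rsa

# After uploading codecommit_rsa.pub to the IAM User, AWS displays an SSH key
# ID (something like APKAEXAMPLEEXAMPLE). SSH presents that ID as the user
# name for CodeCommit hosts, so add a config entry:
cat > ./codecommit_ssh_config <<'EOF'
Host git-codecommit.*.amazonaws.com
  User APKAEXAMPLEEXAMPLE
  IdentityFile ~/.ssh/codecommit_rsa
EOF

# Cloning then looks like this (region and repo name are placeholders):
# git clone ssh://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-blog
```

In practice the `Host`/`User`/`IdentityFile` lines go into `~/.ssh/config` so that `git clone` over SSH just works.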
Set up Hugo
I used the Ananke theme and followed the quick start guide. I chose to clone this theme into the themes directory, rather than creating a submodule, because I ran into issues later on when using a submodule within CodeCommit and running the build. Rather than spending time debugging, I chose the simplest option. Working out the best approach here is something I still need to revisit.
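For reference, the clone-not-submodule choice comes down to two commands, run from the Hugo site root. The theme URL is Ananke's upstream repository, which may have moved since:

```shell
# Clone the theme straight into themes/ rather than `git submodule add`,
# since submodules caused failures in my CodeCommit/CodeBuild setup.
git clone https://github.com/theNewDynamic/gohugo-theme-ananke.git themes/ananke

# Enable the theme in the site configuration:
echo 'theme = "ananke"' >> config.toml
```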
Set up S3 Buckets
Next I created the S3 buckets. I have one for the main site and another for logs during the build pipeline. There are lots of tutorials that walk through setting up an S3 bucket for static web hosting. One obvious step was to disable the Amazon S3 block public access settings that enforce that buckets don’t allow public access to data. This caught me out when the build pipeline tried to put new objects into the bucket and failed with the following error:
```
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
```
I also set up a Bucket policy to allow anyone to `GetObject` on the bucket.
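The policy itself is short. A sketch, written to a local file and applied with the AWS CLI — replace `{BUCKET_NAME}`, and note the `Sid` is an arbitrary label:

```shell
# Write the public-read bucket policy to a local file...
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::{BUCKET_NAME}/*"
    }
  ]
}
EOF

# ...and apply it (needs AWS credentials, so commented out here):
# aws s3api put-bucket-policy --bucket {BUCKET_NAME} --policy file://bucket-policy.json
```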
Set up Custom Domain Name and CloudFront Distribution
What I really want to do is serve the website up through an Amazon CloudFront distribution. There are plenty of examples on the web of how to do this. The key steps for me were:
- Create a web distribution
- Set the Origin Domain Name to the static website hosting endpoint of the S3 bucket and NOT the value from the dropdown. It should be something like `{BUCKET_NAME}.s3-website.{REGION}.amazonaws.com` and not `{BUCKET_NAME}.s3.amazonaws.com`
- Set the Viewer Protocol Policy to redirect HTTP to HTTPS
- Set the Alternate Domain Name, e.g. teachmyselfcloud.com, in the Distribution Settings section
- Create an SSL certificate for the domain name in AWS Certificate Manager (ACM). ACM can automatically create the relevant record in Route53 as well, which is useful.
- Select the custom SSL certificate created above
- Set the `Default Root Object` to `index.html`
- Have a cup of tea or coffee while AWS takes forever to provision the new distribution
- Configure a record set in Route53 with an alias for the domain name pointing to the CloudFront distribution
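Two of these steps have CLI equivalents worth noting: the ACM certificate for a CloudFront distribution has to be requested in us-east-1, and the Route53 alias record always targets CloudFront's fixed hosted zone ID. A sketch, with the domain and the distribution's `cloudfront.net` name as placeholders:

```shell
# Request the certificate in us-east-1 (CloudFront only reads certificates
# from that region). Needs AWS credentials, so commented out here:
# aws acm request-certificate --domain-name teachmyselfcloud.com \
#   --validation-method DNS --region us-east-1

# The alias record change batch. Z2FDTNDATAQYW2 is the fixed hosted zone ID
# used for all CloudFront alias targets; the DNSName is a placeholder.
cat > alias-record.json <<'EOF'
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "teachmyselfcloud.com",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "d111111abcdef8.cloudfront.net",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
EOF

# Apply with your hosted zone ID (needs AWS credentials):
# aws route53 change-resource-record-sets --hosted-zone-id {ZONE_ID} \
#   --change-batch file://alias-record.json
```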
Set up CodeBuild
Next it was time to set up a CodeBuild project. The project uses the CodeCommit repository. I chose the Ubuntu 14.04 base image managed by AWS CodeBuild. I also chose a new Service role, which is used so CodeBuild can interact with other AWS services. Once the service role was created, I needed to add S3 permissions so that CodeBuild could write to S3. I used the following custom policy.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListAllMyBuckets"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:List*",
        "s3:Get*"
      ],
      "Resource": [
        "arn:aws:s3:::{BUCKET_NAME}",
        "arn:aws:s3:::{BUCKET_NAME}/*"
      ]
    }
  ]
}
```
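If you prefer the CLI to the console for this step, attaching the policy to the service role is a single `put-role-policy` call. The role and policy names below are placeholders, and the call assumes the JSON was saved as `codebuild-s3-policy.json`:

```shell
# The call needs AWS credentials, so it is wrapped in a small script here.
cat > attach-codebuild-policy.sh <<'EOF'
#!/bin/sh
set -e
aws iam put-role-policy \
  --role-name codebuild-blog-service-role \
  --policy-name codebuild-s3-access \
  --policy-document file://codebuild-s3-policy.json
EOF
chmod +x attach-codebuild-policy.sh
```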
Create Build Pipeline
By default, CodeBuild looks for a `buildspec` file in the root directory of the CodeCommit repository. This defines the steps that happen after a commit is pushed to the repository. I created a new file called `buildspec.yml` with the following details.
```yaml
version: 0.2

env:
  variables:
    s3_output: "{BUCKET_NAME}"
    hugo_version: "0.54.0"

phases:
  install:
    commands:
      - wget "https://github.com/gohugoio/hugo/releases/download/v${hugo_version}/hugo_${hugo_version}_Linux-64bit.deb"
      - sudo dpkg -i hugo_${hugo_version}_Linux-64bit.deb
    finally:
      - hugo version
  build:
    commands:
      - hugo
      - echo "S3 upload beginning"
      - cd public
      - aws s3 sync . s3://${s3_output} --delete --acl public-read
      - echo "S3 upload ended"
    finally:
      - echo "Script finished running"
```
This is a very simple pipeline that downloads the Hugo binary, builds the site, and then syncs the files in the `public` directory to the S3 bucket.
Set up CodePipeline
Setting up CodePipeline was straightforward. It involved selecting the master branch of the repository in CodeCommit as the source, and then selecting the CodeBuild project for the build phase. I skipped the deploy phase as it is not needed for this pipeline.
And there we have it, the new site is up and running with an automated pipeline.
Invalidate the cache
After setting up the website, I wanted to make some changes to the content. After pushing the changes, they still didn’t show up. This is because the CloudFront distribution I set up uses the default TTL of 86,400 seconds (24 hours). CloudFront distributes files to edge locations only when the files are requested, not when you put new or updated files in the origin S3 bucket. If a file keeps the same name, an edge location only fetches the updated version once the cached copy expires and a new request comes in. The first 1,000 invalidation paths you submit per month are free, but it costs $0.005 for each invalidation path above this.
To invalidate the cache from the command line, I used the following command to find the distribution ID of the web distribution:

```shell
aws cloudfront list-distributions
```

and then ran the following:

```shell
aws cloudfront create-invalidation --distribution-id $CDN_DISTRIBUTION_ID --paths "/*"
```
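The two steps combine nicely into a small script that resolves the distribution ID from the alias using a JMESPath query. The domain is mine, and both calls need AWS credentials, so the script is only written out here:

```shell
cat > invalidate.sh <<'EOF'
#!/bin/sh
set -e
# Look up the distribution whose aliases include the domain.
CDN_DISTRIBUTION_ID=$(aws cloudfront list-distributions \
  --query "DistributionList.Items[?contains(Aliases.Items, 'teachmyselfcloud.com')].Id" \
  --output text)
# Invalidate everything; each run submits the single path "/*" against
# the 1,000 free invalidation paths per month.
aws cloudfront create-invalidation \
  --distribution-id "$CDN_DISTRIBUTION_ID" --paths "/*"
EOF
chmod +x invalidate.sh
```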
After refreshing the browser, I was now able to see the new content.