Static Site Generation for my Personal Site
It is no secret that I love code. It is also no secret that I love to code. To me, there is an elegant simplicity to having everything as code, checking it in to source control, and having automated CI/CD processes do all of the work. Up until this point, I have been using Ansible and Terraform to control my internal and cloud-based infrastructure, but I hadn't tackled my personal website. I had coded my old site by hand, but was looking for something different: something that included code and automation.
Here were my goals:
- Concentrate on content, not code or frameworks.
- Deploy static pages.
- Host on AWS S3 static websites.
- Automate generation and deployment when new content is checked in.
- Keep all assets in source control.
- Be able to create and publish content from anywhere.
  - I routinely work between my personal laptop, a work laptop, and my iPad Pro.
- Use something in Python.
  - I realize this requirement is arbitrary, but I wanted extreme portability and also wanted to force myself to keep learning the language.
Solution:
To be honest, I didn’t look around too much. I googled “python static site generator” and landed squarely on Pelican.
Pelican hit all of my marks, plus it includes a ton of Themes and Plugins to extend its functionality.
Getting Started
Getting going was incredibly easy. Here is what I did:
mkdir mysite
cd mysite
echo "fabric" >> requirements.txt
echo "pelican" >> requirements.txt
echo "markdown" >> requirements.txt
echo "ghp-import" >> requirements.txt
echo "beautifulsoup4" >> requirements.txt
pip install -r requirements.txt
pelican-quickstart
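Answering the quickstart prompts leaves you with a skeleton roughly like this (the exact files depend on your answers and your Pelican version):

mysite/
├── content/          # Markdown source lives here
├── output/           # generated HTML ends up here
├── pelicanconf.py    # development settings
├── publishconf.py    # production overrides
├── fabfile.py        # optional Fabric tasks
├── Makefile          # optional make targets
└── requirements.txt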
Now all I needed was content! I write everything in Markdown, so it is as simple as adding some .md files to the articles directory and then running:
# Generate static HTML from the Markdown content
pelican content
# Serve the generated site locally (run this from the output directory)
python -m pelican.server
…and then checking everything in a browser at http://localhost:8000.
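For reference, each article is just a Markdown file with a short metadata header; Title and Date are the fields Pelican cares about most, and the values below are only placeholders:

Title: Static Site Generation for my Personal Site
Date: 2019-05-01 10:00
Category: Blog
Tags: pelican, python, aws
Slug: static-site-generation

The body of the post, written in plain Markdown, goes here.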
Automation and Production
One of the great features of Pelican is the split between development and production settings. You can override variables in your publishconf.py file to account for the production domain, feeds, and so on.
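As a rough sketch (not my exact file), the publishconf.py that the quickstart generates simply imports the development settings and layers the production values on top; the URL here is a stand-in:

# publishconf.py - used only when publishing (pelican content -s publishconf.py)
import os
import sys
sys.path.append(os.curdir)
from pelicanconf import *

# Production overrides: the real domain and absolute URLs
SITEURL = 'http://tjgreco.com'
RELATIVE_URLS = False

# Turn on feeds and start from a clean output directory for publish builds
FEED_ALL_ATOM = 'feeds/all.atom.xml'
DELETE_OUTPUT_DIRECTORY = True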
Pelican also includes the ability to build and deploy through Fabric. I do not use those features, but you can check them out here.
My automation is done through Travis CI. On every push to the master branch, the build will:
- use a pre-built Python image.
- clone my repository recursively (in order to pull in submodules).
- install the Python modules from requirements.txt.
- generate the site with Pelican using my publishconf.py.
- deploy to S3 and set the object ACLs to public so the website will work.
Here is a breakdown of my .travis.yml file:
language: python
branches:
  only:
    - master
git:
  submodules: true
before_install:
install:
  - pip install -r requirements.txt
script:
  - pelican content -s publishconf.py
notifications:
  slack:
    secure: <ENCRYPTED VALUE HERE>
deploy:
  provider: s3
  access_key_id:
    secure: <ENCRYPTED VALUE HERE>
  secret_access_key:
    secure: <ENCRYPTED VALUE HERE>
  bucket: tjgreco.com
  keep-history: true
  skip_cleanup: true
  local-dir: output
  acl: public_read
Please be sure to encrypt all sensitive information in your Travis file using these techniques.
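If you have the Travis CLI installed, it can write the encrypted values into the file for you; roughly (run from inside the repo so it targets the right project, and substitute your real credentials):

travis encrypt <AWS_ACCESS_KEY_ID> --add deploy.access_key_id
travis encrypt <AWS_SECRET_ACCESS_KEY> --add deploy.secret_access_key
travis encrypt "<account>:<token>" --add notifications.slack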
The key points for making this work with S3 publishing are:
- acl: public_read – sets the uploaded objects to public read so that they work with S3 website hosting.
- skip_cleanup: true – keeps the build artifacts from the build stage (the generated "output" folder). Without it, Travis removes them before the deploy stage runs.
- local-dir: output – only upload the contents of the output folder.
AWS Config
The AWS setup is simple. All you need is:
- A bucket set up for public static website hosting (a CLI sketch follows below).
- An AWS programmatic user with the proper permissions to write objects to your bucket and set ACLs.
  - I also decided to lock this user down to just this bucket to protect the rest of my assets.
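For the bucket itself, something along these lines turns on static website hosting (a sketch using the AWS CLI; the region and error document are assumptions on my part):

aws s3api create-bucket --bucket tjgreco.com --region us-east-1
aws s3 website s3://tjgreco.com/ --index-document index.html --error-document error.html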
Example AWS Policy here:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::<bucket_name>"
      ]
    },
    {
      "Sid": "AllObjectActions",
      "Effect": "Allow",
      "Action": [
        "s3:*Object",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::<bucket_name>/*"
      ]
    }
  ]
}
Links
Here is the link to all of the code and artifacts that run this site. Hopefully you find it useful.