Moving the blog

Since these notes are more miscellaneous jots than real rants, a name change was in order:

$ cli53 rrdelete rants CNAME
$ cli53 rrcreate jots CNAME (..)

Also, after a brew upgrade, an Xcode tools update, or some combination of the two, ruhroh stopped working, and RVM would not build properly for a reinstall. Even though ruhroh had worked fine and produced a nice theme out of the box, this was a good opportunity to try out some of the other static blog generators out there.

After a short search on Hacker News, Pelican looked like a nice candidate. It's written in Python, supports both Markdown and reST, and is actively maintained. It's also well documented (as is usual for Python projects).

Installation is as easy as (provided you have virtualenvwrapper installed):

$ mkproject new-blog
$ pip install pelican markdown
$ pelican-quickstart
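Content then goes in the content/ directory as Markdown files with a small metadata header. A minimal post might look like this (the title, date and category below are made up):

```markdown
Title: Moving the blog
Date: 2013-04-01 10:00
Category: misc

The post body is regular Markdown from here on.
```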

pelican-quickstart will generate a sample configuration and a Makefile with shortcuts. It includes targets to deploy via FTP, SSH, GitHub and Dropbox, but for some reason not S3. This is trivial to add, though:

s3_upload: publish
    s3cmd sync --delete-removed --verbose -rP $(OUTPUTDIR)/ s3://$(S3_BUCKET)/
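For this target to work, the bucket name also needs to be defined near the top of the Makefile, alongside the variables pelican-quickstart already generated (the bucket name here is a placeholder):

```makefile
S3_BUCKET=blog.example.com
```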

The default theme looks good enough, but there are several others to choose from if you don't want to create one from scratch. There's also a handy tool, pelican-themes, to manage them.

After moving old content and finishing this writeup, I synced the contents of output to a new S3 bucket set up for static hosting. I also set up access logging on the bucket, and auto-expire of old logs in the log bucket:

  • From the S3 console, enable access logging on the bucket hosting the blog and choose another (existing) bucket to receive the log files. You can also add a target prefix so the access logs land in a separate directory, in case several web buckets log to the same log bucket.
  • In the properties of the log bucket, add a Lifecycle rule to expire objects from the log directory set up above. Give it a name (like expire-logs), set it to apply either to the whole bucket or to the target prefix specified before, and add an Expiration action to delete objects older than n days.
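The same lifecycle rule can also be set up from the command line with the AWS CLI, assuming it is installed and configured; the bucket name, prefix and 30-day window below are placeholders:

```shell
# Write a lifecycle configuration that expires objects under logs/ after 30 days.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-logs",
      "Prefix": "logs/",
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
EOF

# Apply it to the log bucket (uncomment once the bucket name is real):
# aws s3api put-bucket-lifecycle-configuration \
#     --bucket my-log-bucket --lifecycle-configuration file://lifecycle.json
```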

Finally, a make s3_upload will generate the blog and upload it to S3. Some people use Fabric for this, but a Makefile is more than enough.

As I don't host the pages through GitHub, there's no real need to push the blog repo there. To keep a backup on Dropbox, simply init a bare repository in the Dropbox folder and add it as a remote:

$ mkdir -p ~/Dropbox/repos/blog.git
$ ( cd ~/Dropbox/repos/blog.git/; git init --bare )
$ git remote add dropbox ~/Dropbox/repos/blog.git/
$ git push dropbox master

Any hosted git service is also a good option for offsite storage if you already have an account with one.
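The bare-repo backup flow above can be exercised end to end in a throwaway directory, using mktemp paths in place of the real Dropbox folder:

```shell
set -e

# Stand-in for ~/Dropbox/repos/blog.git and the working copy.
tmp=$(mktemp -d)
mkdir -p "$tmp/dropbox/repos/blog.git" "$tmp/blog"

# Bare repository on the "Dropbox" side.
( cd "$tmp/dropbox/repos/blog.git" && git init --bare --quiet )

# Working repository with one commit, pushed to the bare remote.
cd "$tmp/blog"
git init --quiet
git -c user.name=me -c user.email=me@example.com \
    commit --allow-empty -m "first post" --quiet
git remote add dropbox "$tmp/dropbox/repos/blog.git"
git push --quiet dropbox HEAD
```

Pushing HEAD rather than a branch name sidesteps whether git initialized the default branch as master or main.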