Back on Ghost With Dokku

For the past year I have been hosting my blog on GitHub. I had been back and forth between Ghost and some other options, but I landed on GitHub because I didn't feel like messing with hosting and such.

A few months back I discovered dokku and used it for my BBQ project. It was easy to set up because DigitalOcean has a one-click app for dokku on Ubuntu. I spun up a new machine, set DOKKU_HOST, and started playing. I was stunned at how well dokku is put together. The command line is intuitive and fluent. It feels like Heroku and has some really great integrations like letsencrypt, nginx, storage, and databases.
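
For anyone curious, the getting-started flow is roughly this; the hostname and app name below are placeholders, not my actual values:

# point the local dokku client at the droplet (hostname is a placeholder)
export DOKKU_HOST=dokku.example.com
# dokku commands now run against the remote host, e.g.
dokku apps:create bbq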

Ghost

I have been a long-time fan of Ghost. In fact, I backed the project on Kickstarter. But before version 1 there were things that felt clunky to me. The editor felt very beta. The overall design of the site was okay, but it still felt meh. Since the release of version 1 everything looks great. The editor is very similar to iA Writer in design. It feels inviting and comfortable.

Once I decided to try it out again, I created a simple git repository with two files and pushed it to dokku. Because I have a global domain of dokku.blockloop.io, Dokku automatically creates a new sub-domain for every application I add. For instance, when I run dokku apps:create ghost it activates ghost.dokku.blockloop.io. This allowed me to set up everything as though I would use it for real traffic.
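
I won't reproduce my exact repository here, but a minimal sketch of that kind of deploy, assuming the official ghost Docker image (the image tag and url value are assumptions, not my actual files), looks something like this:

# Dockerfile: build on the official ghost image
FROM ghost:1-alpine
ENV url=https://ghost.dokku.blockloop.io

# from the repository root, add the dokku remote and push to deploy
git remote add dokku dokku@dokku.blockloop.io:ghost
git push dokku master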

Storage

One of the keys to hosting your own blog is keeping the content backed up. Ghost stores a lot of your personal information in the content folder and suggests mounting a host volume into the Docker container, but mounting a directory on the DigitalOcean host risks losing that information if the host is wiped out. So I created a new block storage volume and attached it to my droplet. From there I mounted the volume into the ghost container like this:

dokku storage:mount ghost /mnt/volume-nyc1-01/ghostdata:/var/lib/ghost/content

Now my Ghost content lives on the volume. If the ghost container is updated or trashed, I don't lose my content.
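
One note if you try this: the mount only applies when a new container is built, so a rebuild (and a quick storage:list to confirm) doesn't hurt:

# rebuild the app so the mount takes effect, then confirm it
dokku ps:rebuild ghost
dokku storage:list ghost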

Database

By default Ghost stores blog posts and other user metadata in SQLite within the content folder. That would work just fine, but I could do better. Dokku has a plugin for MySQL (the only relational database Ghost supports now). I enabled the plugin and linked it to my ghost app. I had to set some wonky environment variables because Ghost prefers separate database connection settings rather than a single DSN (which is what is exposed when you link in dokku). Nonetheless, I was up and connected to MySQL.
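
Roughly, that looked like the following; the host, user, and password values are placeholders standing in for the pieces of the DSN that the link exposes:

# install the mysql plugin, create a database, and link it to the app
dokku plugin:install https://github.com/dokku/dokku-mysql.git mysql
dokku mysql:create ghost
dokku mysql:link ghost ghost
# Ghost wants the connection settings broken out (values below are placeholders)
dokku config:set ghost database__client=mysql \
  database__connection__host=dokku-mysql-ghost \
  database__connection__user=mysql \
  database__connection__database=ghost \
  database__connection__password=xxxxxxxx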

Dokku plugins are still Docker containers, so their data is just as ephemeral. Luckily, the database plugins can run backups to S3. I enabled nightly backups to DigitalOcean Spaces, which is S3 compatible.

# auth backups to DigitalOcean
dokku mysql:backup-auth ghost ACCESS_KEY_ID SECRET_ACCESS_KEY nyc3 s3v4 [url-to-space]
# run backup every day at 3am using CRON syntax. bucket can be whatever you want
dokku mysql:backup-schedule ghost '0 3 * * *' [bucket-name]
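
A one-off backup is an easy way to confirm the credentials work (bucket name is a placeholder, as above):

# run a manual backup to the same bucket
dokku mysql:backup ghost [bucket-name]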

That's it! Ghost is now running with a real database that is backed up nightly, the droplet itself has nightly backups, and the content lives on a mounted volume. I have only been running this setup for a few days, so I don't have any major issues to discuss, but dokku hasn't given me any problems yet.
