The one disadvantage of a static-site generator like Hugo is that after you create an entry and build your blog locally, you then have to figure out how to update your site on the server. And Hugo seems to change a whole lot of files with every run of the command hugo, even if you are only creating a single entry.

There’s a lot of encouragement out there to use GitHub to manage the blog and some kind of build utility to transfer the files to the web server.

While I’m using Git (and GitHub and also Gitea-based Codeberg) for some projects, I don’t want to use version control on my blogs. For now anyway.

So how do I get the files from my Hugo /public directory to the /public directory on my web host? I’m using shared hosting now, but this would be the same for a VPS from a provider like DigitalOcean, Google Cloud or Amazon Web Services.

For the past several months, I’ve been using sftp with a password. Not all hosts allow password access. Some insist on ssh with public-private keys. My host (NearlyFreeSpeech.net) allows passwords, and that’s what I’ve been doing until now:

I use the FileZilla FTP client and transfer files from laptop to server. That works, but it’s slow and a bit cumbersome.

My local operating system is Debian Linux (the current stable release, Buster). The method I describe below will work on pretty much any Linux, BSD or macOS system that has rsync installed. macOS ships with an (older) version of rsync. On Debian, you generally have to use the package manager to add it.
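
On Debian and its derivatives, adding it is a one-liner:

sudo apt install rsync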

I’ve used rsync in the past, mostly for backups to USB-connected hard drives, but only sparingly for network file transfers.

I decided to set up an rsync script to do the transfer. I’m going from a local /public directory to a remote public directory, though the names of the directories don’t really matter.

The first thing I did was set up ssh on my shared host. They all do it differently, so find out from your host’s documentation how it’s done. At NearlyFreeSpeech.net, it’s easy to upload your ssh public key, and you’re all set for ssh, rsync and sftp without a password.
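
If you don’t already have a key pair, generating one on the laptop looks roughly like this (the key type and file names may differ on your setup):

ssh-keygen -t ed25519
cat ~/.ssh/id_ed25519.pub    # this is the public key you give to your host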

With ssh set up, here is roughly what my rsync command looks like:

rsync -avzh --delete ~/blog/public/ login_name@web_host_url.com:/home/public/

Remember to replace login_name@web_host_url.com with the login and domain your host tells you to use for ssh, ~/blog/public/ with the path to your local content, and /home/public/ with the path to your remote content.

Also: in rsync, the final / is important. It means, “copy everything that’s in this directory but don’t copy the directory itself.”
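
For reference, here’s what those switches do (a quick summary; man rsync has the full details):

-a        archive mode: recurse into directories and preserve permissions, times, symlinks, etc.
-v        verbose: list each file as it is transferred
-z        compress the data during the transfer
-h        show sizes in human-readable form
--delete  remove files on the server that no longer exist locally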

Before you actually run this command for the first time, it’s helpful to do two things:

  • Back up any content on the server that you don’t want to lose
  • Add the --dry-run switch (after --delete) to your rsync line the first time and see what the system tells you is happening (there’s an example dry run right after this list). If it looks good, remove --dry-run and give it a go for real.
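
The dry run is just the same command with the extra switch:

rsync -avzh --delete --dry-run ~/blog/public/ login_name@web_host_url.com:/home/public/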

Once I was satisfied with my rsync line, I put it in a script, made it executable with chmod a+x and used sudo to stash it in /usr/local/bin:

#!/bin/sh

rsync -avzh --delete ~/blog/public/ login_name@web_host_url.com:/home/public/

exit 0
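
Assuming you saved the script as blog-push.sh (the name is just an example), the chmod and the move look like this:

chmod a+x blog-push.sh                # blog-push.sh is just an example name
sudo mv blog-push.sh /usr/local/bin/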

I should combine these two steps and make a script that does the hugo build and the rsync operation in one go:

#!/bin/sh

# hugo builds the site found in the current working directory, so go there first
cd ~/blog || exit 1
/path/to/my/hugo/directory/hugo

rsync -avzh --delete ~/blog/public/ login_name@web_host_url.com:/home/public/

exit 0
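
Installed the same way as the first script (again, any name will do; I’ll use blog-publish.sh here for the example), publishing a new entry comes down to one command:

blog-publish.sh    # hypothetical name: builds the site and rsyncs it to the server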