The next step in putting a new hard drive into my laptop is to make a full backup of the
/home files. I haven’t done a full-image Clonezilla-style backup in a long, long time, and I do recommend it if you have a spare drive and a lot of spare time.
Instead I just back up my user files. I have two working Linux computers right now, and if I need to rebuild one, I can do a reinstall and get my preferred applications set up fairly quickly.
There are a LOT of excellent backup tools available for free in Linux. I’ve meant to try many of them, but I stick with what I know, which is rsync.
Normally I have a hefty
exclude file to make my rsync backups go more quickly. I have that set up on my old Fedora 32 system, but I haven’t bothered to do it on my current Debian Bullseye laptop.
I’m just doing the full
/home directory. I could use
sudo to collect the few config files here and there that seem to crop up, but I’m NEVER going to need those files, so it’s not necessary. I would need to use
sudo if I had more than one user and wanted rsync to back up all user files and not just those under my account.
I set up a 1 TB spinning USB backup drive with an ext4 filesystem and the drive name backup. I used GNOME Disks to do this. Years ago, I used gParted for this, but nowadays I’m not doing very much partitioning, and I like how Disks works.
Then I use GNOME Files (aka Nautilus) to make sure the backup drive is mounted. I might back up more than one computer to the 1 TB drive, so I created a folder/directory named
debian for this backup.
I already have
rsync installed (
sudo apt install rsync in a terminal if you don’t), and I run the following:
$ rsync -av --delete /home /media/steven/backup/debian/
I let that run. It takes a while for the first backup and is much quicker after that.
I think it’s a good idea to create an
exclude file that lists directories you DON’T want to back up and then reference it in the script with the
--exclude-from switch. There are tons of configuration files and caches in hidden directories that you really don’t need, and putting them in an
exclude file makes your backup go much more quickly — and makes it smaller, too.
The rsync line with the --exclude-from switch looks something like this:
I also recommend setting up your rsync line as a shell script and running the script instead of typing the
rsync line with all your switches and file locations. That way you don’t have to think about having the right drives in the right order and making a mistake that nukes your good files. I’m lazy and have been going into my Bash history to remember the working
rsync line. I’ll get it together soon and create a shell script.
I also recommend using the
--dry-run switch when you are “developing” the rsync line for your particular setup. That way nothing “bad” will happen while you are figuring things out. I did just that.
I make an rsync script so infrequently that I generally do a bit of searching and find a tutorial to make sure I’m doing it the right way. This one at Linuxize is pretty good. I also like this one from DigitalOcean.
I don’t just use
rsync for backups. I also use it over the network to send these
hugo blog files to my shared host. I have
ssh set up, and I have a simple rsync script that syncs my
/public directory with my shared hosting account’s
/public directory. It’s much quicker and more foolproof than using a GUI FTP client.
Rsync is good for situations where the files only “change” on one side of the transaction. For situations where the server is more dynamic and the files on it change, I use Unison to keep two (or more) filesystems in sync. But for situations where I have everything local and it just needs to go one way to the server or backup drive,
rsync remains my go-to.
Next: Pressing “pause” on the rebuild