
Backups And Storage: Part 2

Success! It is alive! My reformatted file/GitLab server is alive and kicking! I’ve reformatted and partitioned the remaining 5x 2TB Western Digital Caviar Green drives, creating a full-size ext4 partition on each one. They’re also set to automount on boot to folders in /mnt named after each drive’s purpose (miscellaneous, steamapps, backups, etc.). I’ve also started a Samba daemon (and written a tutorial for it!) that shares those folders on Windows (and Apple) networks so housemates can share and store files.
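
For reference, the automount and share setup looks roughly like this - a minimal sketch, with placeholder UUIDs and a hypothetical "backups" share name:

    # /etc/fstab - mount each drive by UUID at boot (UUIDs here are placeholders)
    UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /mnt/backups    ext4  defaults  0  2
    UUID=yyyyyyyy-yyyy-yyyy-yyyy-yyyyyyyyyyyy  /mnt/steamapps  ext4  defaults  0  2

    # /etc/samba/smb.conf - share the backups folder on the local network
    [backups]
        path = /mnt/backups
        browseable = yes
        read only = no
        valid users = @users

A quick `sudo mount -a` confirms the fstab entries mount cleanly, and `testparm` checks the Samba config before restarting the daemon.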

On the redundancy side of things, I’ve created some nightly crontab entries that use rsync to back up the most important folders onto the backups drive of the file server (another entry also backs up the GitLab server to that drive). But having all your backups on the same machine - or even in the same house - isn’t much of a safety net! So I also have some nightly entries that rsync all the data to my Digital Ocean remote server as an extra preventative measure.
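
The crontab entries are along these lines - a rough sketch, with the paths, times, and hostname made up for illustration:

    # Local backup: mirror the important shares onto the backups drive at 2am
    0 2 * * * rsync -a --delete /mnt/pictures/ /mnt/backups/pictures/

    # GitLab backup onto the same drive at 3am (hypothetical source path)
    0 3 * * * rsync -a /var/opt/gitlab/backups/ /mnt/backups/gitlab/

    # Off-site copy: push the backups drive to the Digital Ocean box over SSH at 4am
    0 4 * * * rsync -az -e ssh /mnt/backups/ user@example-droplet:/home/user/backups/

The -a flag preserves permissions and timestamps, --delete keeps the local mirror exact, and -z compresses the off-site transfer, which matters on a slow uplink.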

The only problem I’m encountering at the moment is that the “pictures” directory (all our old memories) that needs to be backed up to the remote destination is ~4GB, and I’m on an Australian ADSL2+ connection (~1Mbps upload). So I’m uploading it manually in bits and pieces to build an initial backup, after which rsync can intelligently work out whether something is new, updated, or already backed up.
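
For the piecemeal upload, something like this helps (a sketch; the remote path is made up) - it can be interrupted and resumed without re-sending what’s already there:

    # Resumable, bandwidth-limited upload of the pictures folder
    rsync -az --partial --progress --bwlimit=100 \
        /mnt/pictures/ user@example-droplet:/home/user/backups/pictures/

--partial keeps partially transferred files so a dropped connection doesn’t start from scratch, and --bwlimit (in KB/s) stops the upload from saturating the ADSL link while the household is awake.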

Oh! I also installed Dokku on my development (home) and production (remote, Digital Ocean) servers. I haven’t had a chance to test it out yet; doing so will likely involve modifying my domain records for each app I upload, as I can’t use wildcard records at the moment. I think once I get back into development and modify some existing apps for use with Dokku, I’ll be able to put it through its paces and learn more about simple PaaS deployment.
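
When I do get around to it, the Dokku workflow should look something like this - a sketch assuming a hypothetical app called "myapp" and a made-up domain:

    # On the Digital Ocean server: create the app
    dokku apps:create myapp

    # Locally: add the Dokku remote and push to deploy
    git remote add dokku dokku@example.com:myapp
    git push dokku master

    # Without a wildcard record, each app needs its own DNS entry, e.g.
    # myapp.example.com.  A  <droplet IP>

Dokku routes requests by hostname, hence the per-app DNS record standing in for a wildcard.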

Overall, I’m feeling pretty chuffed with the results so far.