So the reason I needed to back up one of the SD cards from my Raspberry Pi earlier (and I wrote a tutorial about it) was that I wanted to explore PirateBox, a free-software implementation of a private network that lets users anonymously connect, chat and share files.
I built it using the tiny 802.11n USB adaptor that came with my Raspberry Pi and a 16GB thumb drive I had lying around.
Welcome again to another short Adam O’Grady tutorial! This time I’m going to run through the brief process of cloning an SD card. This is particularly useful for backing up things such as Raspberry Pi install images, and it extends to USB flash drives used as boot media as well (or, really, most other drive types).
With the SD card you want to back up disconnected, open up a Terminal prompt and run df -h to see which devices are currently mounted; then connect the card and run df -h again. The entry that appears in the second listing but not the first is your SD card’s device.
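Once you’ve identified the card’s device node, the clone itself is a single dd command. A minimal sketch, assuming the card showed up as /dev/sdb (substitute whatever df -h reported on your machine):

```shell
# ASSUMPTION: the SD card appeared as /dev/sdb; adjust for your system.

# Unmount any auto-mounted partitions first so the image is consistent.
sudo umount /dev/sdb1 2>/dev/null

# Clone the whole device to an image file. bs=4M speeds up the copy;
# status=progress (GNU dd) shows a running byte count.
sudo dd if=/dev/sdb of=pi-backup.img bs=4M status=progress

# Restore later by swapping if= and of=:
# sudo dd if=pi-backup.img of=/dev/sdb bs=4M status=progress
```

Be careful: writing an image back to the wrong device will destroy whatever is on it, so double-check the device name every time.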
As mentioned previously, I’ve been interested in rsync and have started using it as my utility for backing up important files, so I thought I would write an extremely brief post on some of my thoughts for setting this up. For the sake of security, I recommend backing up to remote servers over SSH with the -e ssh option, but to get this working automatically through cron you’ll also need to set up passwordless SSH access for your root user.
If you have a computer with two Network Interface Cards (NICs), you can place it in the middle between your gateway/router and the rest of your network: run an Ethernet cable from the router to that computer, then either run Ethernet from the computer to the rest of the network, or use the computer to create a wireless network (with tools such as hostapd) that other clients can connect to. We don’t aim to cover that setup in particular; however, we will give a good setup you can use if your router is running DD-WRT (a feature-packed, Linux-based firmware for consumer routers).
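For reference, turning the second NIC into an access point with hostapd comes down to a small config file. A minimal sketch (the interface name, SSID and passphrase are placeholders, not a recommended production config):

```
# /etc/hostapd/hostapd.conf — minimal 802.11n WPA2 access point.
# ASSUMPTION: wlan0 is the wireless NIC; SSID and passphrase are examples.
interface=wlan0
driver=nl80211
ssid=MyHomeAP
hw_mode=g
channel=6
ieee80211n=1
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=ChangeMePlease
rsn_pairwise=CCMP
```

You can test it in the foreground with `sudo hostapd /etc/hostapd/hostapd.conf`; you’d still need DHCP (or bridging to the router) so clients actually get addresses.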
I’ve recently decided to repurpose my HP N40L ProLiant MicroServer, loading in some spare 500GB hard drives (bringing the total to 4x 500GB, split between Western Digital Caviar Blue and generic Hitachi drives) and installing Ubuntu Server 14.04 on it.
Preface

HP produces a brilliant (in my experience) line of microservers under the ProLiant label, and I managed to pick up an N40L some time ago on special at my local computer shop.
Outdated

Given the work that’s gone into my tech setup so far, I think it’s time to step back and take a look at the current tech infrastructure setup that I have going on.
File Server

Hardware:
- AMD Athlon X4 processor
- 8GB RAM
- 6x 2TB Western Digital Caviar Green hard drives

Services:
- GitLab CE site running as a private code repository and version control system
- CIFS/SMB file shares for five of the drives, provided by Samba
- rsync-based backup system duplicating important files on a secondary drive and also on the remote server
- rsync-based backup also collects and stores database backups from development/production servers

Development Server

Hardware:
- AMD E350 Fusion platform
- 4GB RAM
- 60GB SSD

Services:
- Dropplet-based blog for testing themes, reporting on mental health and as an outlet for Snowy
- Ghost-based blog for testing themes and miscellaneous purposes
- DNS server to allow easier access to resources within the private network of my home
- Dokku PaaS implementation to allow efficient rollout of development applications
- MongoDB instance to store key-value data

Remote Server

Hardware:
- $20 Digital Ocean instance
- Dual-core processor
- 2GB RAM
- 40GB SSD
- 3TB monthly data transfer

Services:
- Main website host
- Dropplet-based blog for keeping track of Operations/Administration adventures and providing tutorials on these topics
- Ghost-based blog for keeping track of Development adventures and providing tutorials on related topics
- Dokku PaaS implementation to allow efficient rollout of production applications
- MongoDB instance to store key-value data
Alright, despite what I said in a previous article, my Dokku installation didn’t actually go through at the time on either the development or production server for some reason.
I’ve since rectified that and managed to get a full install on both machines, and even got as far as pushing an app to my development server before discovering one tricky flaw that I probably should have seen coming: Docker containers don’t interact normally with the rest of the machine.
Success! It is alive! My reformatted file and GitLab server is alive and kicking! I’ve reformatted and partitioned the remaining 5x 2TB Western Digital Caviar Green drives, creating a full-size ext4 partition on each one. They’re also set to automount on boot to folders in /mnt that are named for each drive’s purpose (miscellaneous, steamapps, backups, etc.). I’ve also started a Samba daemon (and written a tutorial for it!).
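The automount side is handled by /etc/fstab. A minimal sketch of what those entries can look like (the device names here are illustrative, not my exact layout):

```
# /etc/fstab — ASSUMPTION: /dev/sdb1 etc. are example device names.
# Using UUIDs (from `blkid`) is safer than /dev/sdX names, which can
# reorder between boots.
/dev/sdb1  /mnt/miscellaneous  ext4  defaults  0  2
/dev/sdc1  /mnt/steamapps      ext4  defaults  0  2
/dev/sdd1  /mnt/backups        ext4  defaults  0  2
```

After editing, `sudo mount -a` mounts everything listed without needing a reboot, which is also a handy way to catch typos before they bite at boot time.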
Alrighty then, it appears the file server component of my storage machine is finally up and running; it has survived a reboot and had a test write and read from both Windows (via SMB/CIFS, you know, Samba) and Linux (via the host machine). So now I’m going to run through a quick tutorial on setting up Samba on Ubuntu 14.04 (I’ve used a headless Server edition, but whatever floats your boat) so you too can work towards creating a networked file server.
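To give a taste of what’s involved, a share definition in Samba is just a few lines in /etc/samba/smb.conf. A minimal sketch (the share name, path and user are placeholders):

```
# /etc/samba/smb.conf — ASSUMPTION: share name, path and user are examples.
[backups]
   path = /mnt/backups
   browseable = yes
   read only = no
   valid users = adam
```

After editing, give the user a Samba password with `sudo smbpasswd -a adam` and restart the daemon with `sudo service smbd restart` (Ubuntu 14.04 uses Upstart-style service management).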
I’ve always loved accessing things via SSH; it’s nifty, quick and way more secure than trying to telnet into a device. Something cool that I’ve set up recently, though, is SSH access sans password! It’s not incredibly difficult, and the end result has some nifty benefits, particularly the ability to script SSH sessions with commands without you needing to be there to type a password. The first use that came to mind for me was cron and rsync combos for regular, secure, remote backups, but it also allows you to do some neat stuff with PaaS (Platform-as-a-Service) implementations like Heroku or, in my case, Dokku.
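The core of passwordless SSH is a key pair: generate one locally, push the public half to the server, and subsequent logins skip the password prompt. A minimal sketch (the host and user are placeholders):

```shell
# Generate a key pair with an empty passphrase.
# ASSUMPTION: you want fully unattended logins; a passphrase-protected
# key plus ssh-agent is the safer option for interactive use.
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa

# Append the public key to the remote user's ~/.ssh/authorized_keys.
# ASSUMPTION: backup@backuphost is an example host.
ssh-copy-id backup@backuphost

# From now on this connects without a password prompt:
ssh backup@backuphost
```

This is exactly what lets cron jobs and git-push deployments (Heroku, Dokku) run without a human at the keyboard.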