It’s good to be back and blogging again about potential issues and tutorials encountered when operating or deploying equipment. In this instance, Samba on Ubuntu 14.04 fails to load nmbd on startup (particularly when using a statically assigned IP), preventing it from starting a NetBIOS Name Service that advertises its hostname on the local network. There is a pretty easy solution/workaround for this, however, so we’ll be brief.
You need to open up /etc/init/nmbd.conf (Upstart jobs on 14.04 live under /etc/init/ with a .conf extension).
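As a sketch of the workaround (the exact stanza in the stock job may differ on your install), the idea is to relax the `start on` condition so nmbd isn’t left waiting on a network event that never fires for a statically configured interface:

```
# /etc/init/nmbd.conf (excerpt -- the original condition on your system may vary)
#
# Before, roughly:
#   start on (local-filesystems and net-device-up)
#
# After, so the job starts once the usual runlevels are reached
# regardless of network-up events:
start on (local-filesystems and runlevel [2345])
```

After editing, `sudo service nmbd start` (or a reboot) should bring the name server up without manual intervention.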
A problem I’ve been having recently with a couple of Ubuntu 14.04 servers is receiving “Waiting for network configuration” and “Waiting up to 60 seconds more for network” messages on boot, despite all network functions being available once booting has finished. I’ve so far tracked it down to an issue affecting servers that have both static IP addresses set and Samba installed through Aptitude.
The fix I found for it is updating your /etc/network/interfaces file so that the static IP address assignment looks something like the following:
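A minimal static stanza along those lines (the interface name and all addresses here are placeholders — substitute your own network’s values):

```
# /etc/network/interfaces -- example static assignment
# (eth0 and the addresses below are assumptions; adjust to your network)
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
    dns-nameservers 192.168.1.1
```

The `auto eth0` line matters: it tells ifupdown to bring the interface up at boot, which is what the boot-time wait is checking for.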
Sorry for the relatively long time without updating; I haven’t been working much on the Operations side, instead doing either confidential programming or relaxing (quite substantially, too).
So I noticed my Samsung Galaxy S III had become quite sluggish of late, and even a factory reset and reconfigure didn’t seem to fix things, so I decided to finally look at flashing another ROM and went with CyanogenMod.
My choice was mainly based on popularity, having heard it mentioned the most (in good terms), and with its positive reputation and light footprint I was easily swayed (I’m using past tense but am writing this up as I go along).
So the reason I needed to back up one of the SD cards from my RaspberryPi earlier (and wrote a tutorial about it) was that I wanted to explore PirateBox, a free software implementation of a private network that lets users anonymously connect, chat and share files.
I utilised the tiny, 802.11n USB adaptor that I got with my RaspberryPi and a 16GB thumb drive I had lying around to build it.
Welcome again to another short Adam O’Grady tutorial! This time I’m going to run through the brief process of cloning an SD card. This is particularly useful for backing up things such as images of Raspberry Pi installs and can be extended to use with USB flash drives that are used for boot images as well (or actually most other drive types).
Disconnect the SD card you want to back up, open up a Terminal prompt and run df -h to note which devices are currently mounted. Then connect the card and run df -h again; the entry that appears is your SD card.
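Once you’ve identified the device, the clone itself is a single dd copy. A minimal sketch — the device name /dev/sdb and the output path are assumptions, so substitute the device you identified and double-check it, as dd will happily overwrite the wrong disk:

```shell
#!/bin/sh
# Assumed device and output path -- adjust to match your df -h output.
SD=/dev/sdb
OUT="$HOME/pi-backup.img"

# Only proceed if the target actually exists as a block device.
if [ -b "$SD" ]; then
    # Clone the whole card to an image file; bs=4M speeds up the copy.
    sudo dd if="$SD" of="$OUT" bs=4M
    sync    # flush pending writes before removing the card
fi

# To restore later, swap if= and of=:
#   sudo dd if="$OUT" of="$SD" bs=4M
```

Note that dd copies the entire card, including empty space, so the image will be the size of the card rather than the size of the data on it.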
As mentioned previously, I’ve become interested in and have started using rsync as my utility for backing up important files. I thought I would write an extremely brief post with some of my thoughts on setting this up. For the sake of security, I recommend backing up over SSH with the -e ssh option when backing up to remote servers, but to get this working automatically through cron you’ll also need to set up passwordless SSH access for your root user.
If you have a computer with two Network Interface Cards (NICs) set up, you can place it in the middle between your gateway/router and the rest of your network: plug an Ethernet cable from the router into said computer, then run Ethernet from the computer to the rest of the network, or use the computer to create a wireless network (with tools such as hostapd) which other clients can connect to. We don’t aim to cover this in particular; however, we will give a good setup you can use if your router is running DD-WRT (a feature-packed Linux-based firmware for consumer routers).
I’ve recently decided to repurpose my HP N40L Proliant MicroServer, loaded in some spare 500GB hard drives (bringing the total to 4x 500GB split between Western Digital Caviar Blue and generic Hitachi drives) and put Ubuntu Server 14.04 on it.
Preface

HP produces a brilliant (in my experience) line of microservers under the ProLiant label, and I managed to pick up an N40L some time ago on special at my local computer shop.
Outdated

Given the work that’s gone into my tech setup so far, I think it’s time to step back and take a look at the current tech infrastructure setup that I have going on.
File Server

Hardware
- AMD Athlon X4 processor
- 8GB RAM
- 6x 2TB Western Digital Caviar Green hard drives

Services
- GitLab CE site running as private code repository and version control system
- CIFS/SMB file shares for 5x of the drives provided by Samba
- rsync-based backup system duplicating important files on a secondary drive and also on the remote server
- rsync-based backup also collects and stores database backups from development/production servers

Development Server

Hardware
- AMD E350 Fusion platform
- 4GB RAM
- 60GB SSD

Services
- Dropplet-based blog for testing themes, reporting on mental health and as an outlet for Snowy
- Ghost-based blog for testing themes and miscellaneous purposes
- DNS server to allow easier access to resources within the private network of my home
- Dokku PaaS implementation to allow efficient rollout of development applications
- MongoDB instance to store key-value data

Remote Server

Hardware
- $20 Digital Ocean instance
- Dual core processor
- 2GB RAM
- 40GB SSD
- 3TB monthly data transfer

Services
- Main website host
- Dropplet-based blog for keeping track of Operations/Administration adventures and providing tutorials on these topics
- Ghost-based blog for keeping track of Development adventures and providing tutorials on related topics
- Dokku PaaS implementation to allow efficient rollout of production applications
- MongoDB instance to store key-value data
Alright, despite what I said in a previous article, my Dokku installation didn’t actually go through at the time on either the development or production server for some reason.
I’ve since rectified that and managed to get a full install on both machines, and even got as far as pushing an app to my development server before discovering one tricky flaw that I probably should have seen coming: Docker containers don’t interact normally with the rest of the machine.