
I used to love running my own servers with all the services, etc. I'd manually write beautiful bash scripts to keep it all nice and easy to rebuild on the fly. My first job had 10 Ubuntu servers (on site), and I was the only guy who used Linux at home and had experience with SQL.

I have never volunteered to maintain servers since; it was horrible, and everything was always my fault (it kinda was: I was a hobbyist at best with no real production Linux experience).

I do still end up as the DevOps/infra guy at every place I've worked, but at this point I'm probably one of those stubborn senior guys who wouldn't like the way the juniors went about it.



Yeah, I tried self-hosting everything. Getting it actually running is the easiest part; it's the maintenance, backups, and security that are 90% of the job. You can get it working pretty easily, forget about it, and it will run for a while until something goes wrong or it needs to be upgraded.

Now I'd rather leave hosting to someone dedicated to it, who has internalized the latest state of things for all the relevant bits of software and keeps that knowledge current. Set-and-forget self-hosting can't work in the current environment, where things require constant security updates and complex security hardening.


For home hosting, the trick is KISS.

I used to back up to external drives. Now I use bare drives, since finding big externals got difficult.

I use (and probably abuse) docker compose. K8s is great, but compose is easier.
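
Something like this, roughly (the service names, ports, and paths below are just placeholders, not my actual stack):

    # docker-compose.yml - hypothetical services, adjust to taste
    services:
      pihole:
        image: pihole/pihole:latest
        restart: unless-stopped
        ports:
          - "53:53/udp"
          - "8080:80"
        volumes:
          - ./pihole/etc:/etc/pihole
      jellyfin:
        image: jellyfin/jellyfin:latest
        restart: unless-stopped
        ports:
          - "8096:8096"
        volumes:
          - ./jellyfin/config:/config
          - /mnt/media:/media:ro

Backups are then mostly just stopping the stack and copying the volume directories somewhere else.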

I use a single makefile. Kinda ugly but it's fine.

A bunch of friends and family use my "services". They usually chip in for hard drives and stuff.

I have a few central points of failure, but it keeps things easy. My uptime still beats most big clouds, though I have it easier.

I accidentally took my server down for a few days with a botched hardware install. It's a bit funny, because now we realize how critical the home server has become to us. On the other hand, I've already got my spouse's blessing to build a standby backup server.


I've recently started running Unraid at home on an old desktop PC and it's really nice. I've also migrated my UniFi controller, Plex server, and Pi-hole to it, and it's very easy. Way nicer than the previous setup, where I had random dedicated devices each needing their own kind of maintenance: the UniFi controller on my gaming PC needed me to download and install updates manually, the Plex server on an old Windows laptop hardly received any updates and I was always worried about breaking it, and I almost never looked at the Pi-hole running on a Raspberry Pi.

Now I have a single dashboard and can upgrade each container with a single click, and everything stays on the happy path.


Sounds like you might've had an unusually bad experience. Might've also been the distro; I don't like Ubuntu much myself. :P

Maintaining inherited environments is also much more painful than ones you get to design from the ground up. I work with varied environments, and one with ~250 RHEL/CentOS machines has approximately the same maintenance burden as another with a dozen or so Ubuntus, because the first has had configuration management from the beginning and the second is a complete mess that I've slowly been reverse-engineering and cleaning up.

When your change management works, maintaining a dozen servers isn't all that different from maintaining a thousand or more, and the need for change management and automation doesn't really go anywhere even when you don't self-host things.
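
To make "configuration management from the beginning" a bit more concrete: any of the usual tools does the job, but as a rough sketch (the group names, packages, and files below are purely illustrative), an Ansible playbook for that kind of environment looks something like:

    # site.yml - illustrative only
    - hosts: rhel_servers
      become: true
      tasks:
        - name: Apply pending package updates
          ansible.builtin.dnf:
            name: "*"
            state: latest
        - name: Make sure the firewall is running
          ansible.builtin.service:
            name: firewalld
            state: started
            enabled: true
        - name: Drop in the standard sshd config
          ansible.builtin.copy:
            src: files/sshd_config
            dest: /etc/ssh/sshd_config
            mode: "0600"
          notify: restart sshd
      handlers:
        - name: restart sshd
          ansible.builtin.service:
            name: sshd
            state: restarted

Run against ten boxes or a thousand, it's the same playbook, which is the point.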


What do you suggest as a maintainable distro?


I like RHEL and derivatives more for servers myself. It's probably just preference, but I find that RHEL-like distros step on my toes less often. In particular, I don't like debconf at all, and Ubuntu pushing snaps everywhere also leaves a bad taste in my mouth.



