Hacker News

Yes! This is what experience has taught me too.

We tend to underappreciate the importance of time in everything. A button click can instantiate something powerful (and useful, and easy to use...), but it will degrade over time and eventually flat-out stop working.

I had a stack that worked just fine for my own needs, but it ran on *shudder* Python 2.7 -- everyone knows how that worked out (I chose to rebuild my stack on a different platform).



> A button click can instantiate something powerful (and useful (and easy-to-use...)), but it will degrade over time, and eventually flat-out stop working

Software doesn't degrade over time (other than, you know, things like cosmic-ray bit flips, but in most realistic situations those should be fully mitigable).

The needs of the software user (including the hardware and software they want the piece of software to interact with) may evolve, but that's different from software degrading over time.

> I had a stack that worked just fine for my own needs, but it ran on *shudder* Python 2.7 -- everyone knows how that worked out

While there's no further first-party support for that version of Python, if it worked properly before, Python 2.7 and the software running on it probably still works properly now.
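To make that concrete (a generic illustration, not code from either commenter): a frozen 2.7 interpreter will keep running its old code exactly as before, but the same source can silently mean something different on 3.x, which is part of why "just upgrade" is rarely trivial. The `/` operator is the classic example:

```python
# Classic Python 2 -> 3 semantic change: the division operator.
# Under Python 2.7, 3 / 2 floor-divides two ints, yielding 1;
# under Python 3.x, / is always true division.
print(3 / 2)   # 1.5 on Python 3.x (was 1 on Python 2.7)
print(3 // 2)  # 1 on both versions: // is explicit floor division
```

So code that keeps running on a pinned 2.7 interpreter really does "still work" -- the breakage only appears when the interpreter underneath it changes.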


I would absolutely use "degrade" to describe what happens to public-facing or Internet-connected software over time—eventually you'll have to upgrade it for security reasons, and you'll often find that this is way more involved than just upgrading the server-side package itself, or even its immediate dependencies. The alternative is even more work back-porting security patches. All this is assuming someone's actively working on the software you're self-hosting, at least enough to spot, advertise, and fix vulnerabilities.

Ditto the average Rails/Python/JavaScript project, as anyone who's tried to resurrect one that's gone so much as six months without being touched can attest. Which might not matter, except that a ton of the software people might actually want to self-host is in one or more of those high-entropy ecosystems. Extraordinary levels of care and organization on the part of the creators and maintainers can mitigate this, but that amount of taste and effort is vanishingly rare.
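One form that care can take (a generic sketch, not something either commenter prescribes): pinning exact dependency versions, so that a project resurrected months later installs the same tree it last ran with, rather than whatever the ecosystem has drifted to since. In Python this is conventionally a frozen requirements file:

```
# requirements.txt -- exact pins, typically generated with `pip freeze`
# (package versions here are purely illustrative)
Django==4.2.11
requests==2.31.0
urllib3==2.2.1
```

Pinning doesn't stop the security clock described above -- it just makes "runs the same as it did six months ago" reproducible while you deal with the upgrades.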

These are cases of degradation due to a changing environment, sure, but I wouldn't describe them as due to evolution in the needs of the user (presumably "must not have any well-publicized remote vulnerabilities" was a need from the beginning).


This comment was brought to you by someone who has never produced or maintained software that had to withstand a 24/7 onslaught of automated exploit kits and port scanners over an extended period of time.


Or written any software other than a one-off script, if I had to guess.


If your software is not publicly accessible, it may be possible for you to keep running on 10+-year-old dependencies indefinitely. For anyone other than a hobbyist, it is just not practical.

Otherwise, you are going to be influenced by external factors (security vulnerabilities, wanting to use a feature only available on a newer language version or OS, etc.). If you are a business, you'll also run into more practical concerns, like engineers not wanting to work on a mountain of technical debt.


Sure, but my old Google cloud apps on Python 2.7 will one day get rug-pulled, forcing an upgrade. They can only keep working forever if the platform doesn't change underneath them.


> Sure, but my old Google cloud apps on python 2.7 will one day get rug-pulled and forced to upgrade

“Degradation over time” was being cited as a reason not to self-host. Pointing out that not self-hosting exposes you to the risk of others changing the environment so it no longer supports your software is a diametrically opposed argument.


Oops! I missed that point entirely.



