Docker is not reproducible: you just apt-get install what you want and hope the RUN commands get you a version that is still compatible with your software and doesn't break anything. With NixOS, you can pin the specific hash of every package, and you don't need the whole of Ubuntu 20.04 in a container to deploy a webapp.
How does apt-get install handle the following scenario?
You need the following software:
- The latest release of Erlang for your app
- A database (like Riak), which in turn happens to need an older version of Erlang
- The new Erlang needs the latest libopenssl
- The old Erlang needs an old v1.x libopenssl
- You used some open source software written in C++, so you need libstdc++
- Some of your own C++ code has exposed a bug in libstdc++, so you need yet another libstdc++ with your patches, but just for that software
- Etc
How does apt-get install allow you to have multiple versions of a given dynamic library (with the same SONAME) and multiple binaries (erl, clang, etc) installed at the same time?
The answer: it doesn't.
Nix has no problem handling everything described above, and it handles it easily; the sketch below shows the basic shape.
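A minimal sketch of what that looks like, assuming two pinned nixpkgs snapshots; the revisions, sha256 values, and the riak attribute are illustrative placeholders, not verified pins:

```nix
# shell.nix (sketch only): <new-rev>, <old-rev>, and the hashes are
# placeholders, and the riak attribute is assumed to exist in the old
# snapshot.
let
  newPkgs = import (fetchTarball {
    url = "https://github.com/NixOS/nixpkgs/archive/<new-rev>.tar.gz";
    sha256 = "<hash-of-new-tarball>";
  }) { };
  oldPkgs = import (fetchTarball {
    url = "https://github.com/NixOS/nixpkgs/archive/<old-rev>.tar.gz";
    sha256 = "<hash-of-old-tarball>";
  }) { };
in
newPkgs.mkShell {
  packages = [
    newPkgs.erlang  # latest Erlang, built against the current openssl
    oldPkgs.riak    # Riak brings its own older Erlang and openssl 1.x,
                    # referenced by absolute /nix/store/<hash>-... paths,
                    # so nothing collides with the line above
  ];
}
```

Every package and every transitive dependency lives in its own hash-named /nix/store path, which is exactly why two Erlangs and two libopenssls (same SONAME and all) can coexist on one machine.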
So, yes, you can pin versions with apt-get install, but you're out of luck if any of those packages has a transitive dependency that needs a conflicting version.
Also, the pinning guarantees are stronger with Nix: you can be guaranteed that your packages and their transitive dependencies are byte-for-byte identical, every time. Pinning end-to-end across all packages (and every transitive dependency thereof) is not a normal, happy-path thing to do in apt, so you would be hard pressed to find anyone who does it, given how onerous it would be; in Nix it's trivial.
What's the model of the pinning? Is it an arbitrary semver? Is it a sha256 calculated by the content of the package (Content Addressed)? Is it a checksum calculated from all the inputs that were used to create the package install spec (Input Addressed)?
At best, you can do something like this: apt-get install gparted=0.16.1-1
That handles the semver case, but it doesn't address the rest of the "all of the above", both of which Nix gives you (see the sketch after this list):
- sha256 calculated by the content of the package (Content Addressed)
- checksum calculated from all the inputs that were used to create the package install spec (Input Addressed)
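For contrast, here is roughly where each kind of hash shows up in Nix (the rev and hash values are placeholders):

```nix
let
  pkgs = import (fetchTarball {
    # Content-addressed pin: the sha256 of the tarball's actual bytes.
    # If upstream republishes something different, the fetch fails loudly
    # instead of silently drifting.
    url = "https://github.com/NixOS/nixpkgs/archive/<rev>.tar.gz";
    sha256 = "<content-hash-of-tarball>";
  }) { };
in
# Input-addressed output: the hash in the resulting store path, e.g.
# /nix/store/<hash>-erlang-26.2.1, is computed from every input to the
# build (sources, dependencies, flags, compiler), not from the output.
pkgs.erlang
```

So a Nix pin is not a version string you hope the mirror still honors; it is a hash that either matches exactly or fails the build.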
> Nix isn’t the first package manager to do something unique.
Sure. But it is the first package manager to do the particular things described above. After all, that is precisely why Nix exists: no package manager before it did what it does.
Because the hardware is really good. Better than any other hardware you can find (especially from a performance-per-watt standpoint, which for a laptop is somewhat important). The touchpad and keyboard are also top tier, the screen is really nice, the speakers are good, and I ran out of adjectives.
Obviously everyone has to make their own decisions on tradeoffs, but for me there's no such thing as hardware good enough to put up with all the complications that Asahi Linux will face for some time to come (as they work to iron out all the kinks).
> especially from a performance-per-watt standpoint, which for a laptop is somewhat important
I think it is more important to have an OS where the everyday functionality actually works.
Sure, I'd love my ThinkPad to have 15 hours of battery life. But I'm not living in a tent either, so I'd rather be able to use the onboard webcam, plug in an external monitor, get sound without an external card, and have stable wifi. And I'd throw a 500 g power bank in the backpack for the handful of times I really need longer battery life.
Have you watched the video? It doesn't matter that the hardware is good if you can't use most of it under Linux. Thunderbolt 4 becomes USB 2; HDMI? No. Camera? No. Sound? No. HDR display? No. Battery life? Likely sucks too. A decent AMD Ryzen 3/4 laptop with a good screen will be a much better choice for Linux.
It has been tested. It has been sold and used commercially, and by the government, for $300 before. This is not an example of actual costs being $50k. It is an example of massive incentives on the seller's side to rip off the one who pays (the US population), and no incentives on the buyer's side to care about it.
If it is the wrong metal, or a wrong enough alloy, merely touching the parts it is in contact with can cause a galvanic reaction. Several airplanes have crashed due to damage caused by galvanic corrosion. The FAA is frequently called a "tombstone agency" because it tends to only take action after a bunch of people die.
I personally wouldn't mind not having moderators, as the ones we have at the moment are really bad and have pushed their political views in ostensibly impartial subs (like /r/politics).
I mean, why not create your own subreddit then? All you have to do is create it, get a small group of people to see it, post good articles, and keep a 24/7 watch to kill the comment and post spam pushing viagra and obvious trash.
So, to be clear: you're fine with all the moderators leaving, but you don't want to moderate, but you also don't want there to be no moderation. You would like arbitrarily "good" moderation that you agree with completely, provided by someone else. Fantastic. Online community in a nutshell.
I think the general reddit community underestimates how important moderation is. Yes, you have some moderators who go overboard, push a personal agenda on something that's supposed to be a neutral subreddit, etc. But subreddits without active moderation very quickly (in order of increasing tragedy) lose their focus, get overtaken by spam, or become hangouts for Nazis. reddit closes unmoderated subs for a reason.
Also, a large chunk of the people who complain about moderators are actually upset that moderators stop their subreddits from being overtaken by Nazis. If you want completely unmoderated anonymous public discourse, go to 4chan and see how it goes there. Though even that is moderated.
I don't want zero moderation; I just want good moderation, which is not what you get when a few mods push their agenda across many enormous subs.
I frequented /r/cfb and /r/collegebasketball, which are terrific and very on-topic. Very little drama there. I can easily imagine those going to the dumpster without their mod teams.
But with the Vision Pro there is near-zero latency and the possibility of having many big screens, whereas when remote-desktopping into a Mac from an iPad you gain nothing and the high latency makes it worse.
Because it's not remote: if you are streaming from your Mac, it's right there, 2 m away at most. That's much different from remote-connecting from an iPad 100 km away.
It wouldn't, but why would you use an iPad as your main screen if you are near a Mac? It's just a screen; it doesn't bring anything special the way the Vision Pro would.
Rust's philosophy (unlike Python's, for example) is to not include tons of stuff in the standard library; this way you can choose the best implementation for your specific use case from one of the many crates (which are super easy to install, by the way). There is no "always the best" implementation for a given piece of functionality, nor a legacy-but-still-officially-supported-in-std implementation that nobody uses anymore but still has to be maintained under its own namespace.
I don't see this as negative or "reinventing the wheel". Reinventing the wheel would be writing your own implementation, which doesn't happen if you can choose from many high-quality crates.
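A concrete example: the standard library deliberately ships no random number generator, so you pull one in from the ecosystem. This is just a sketch using the widely used rand crate; the version pin is illustrative:

```rust
// Added to Cargo.toml by running `cargo add rand` (illustrative pin):
// [dependencies]
// rand = "0.8"

use rand::Rng; // the RNG lives in a crate, not in std

fn main() {
    let mut rng = rand::thread_rng();    // thread-local generator
    let roll: u8 = rng.gen_range(1..=6); // uniform roll in 1..=6
    println!("rolled a {roll}");
}
```

And if rand ever stops being the best option, the ecosystem can move on to something better without std having to carry the old one forever.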