> I can pull down a 20 year old exe and still run it today on Windows. Try doing the same with a Linux binary that's just a year old. There's no guarantee that it will be able to run based off some update that has happened
IMHO, you're comparing two different things.
The traditional way of installing apps on Windows is to pack all dynamic dependencies with the app.
On Linux, dynamic dependencies are shared between apps, so it's not surprising that an app stops working when you change its dependencies.
There are a few ways to solve this, and you are free to choose:
- distribute dependencies the same way as on Windows
- link statically
Aside from comparing two different things, as you correctly identify, I believe that even the author's original assertion just isn't true. Maybe for some exe files, but I doubt for all or even most.
I was involved in replacing Windows systems with Linux + Wine, because (mission-critical industrial) legacy software stopped working. No amount of tweaking could get it to work on a modern Windows system. With Wine it ran without a hitch, once all the required DLL files were tracked down.
Wine may indeed be quite stable and a good solution for running legacy Windows software, but I think any dynamically linked legacy software can cause issues, both on Windows and on Linux. Kernel changes may be a problem too. While Windows is often claimed to be backwards compatible, in practice your mileage may vary, as my client found out the hard/expensive way.
> I was involved in replacing Windows systems with Linux + Wine, because (mission-critical industrial) legacy software stopped working. No amount of tweaking could get it to work on a modern Windows system. With Wine it ran without a hitch, once all the required DLL files were tracked down.
I moved from Windows 11 to Linux for the same reason. I was using an old version of Office because it was faster than the included apps: the full Word started faster than WordPad (it was even on par with Notepad!), and the Outlook from an old Office used less RAM and was more responsive than the one included with Windows!
When I got a new laptop, I had problems installing each of the old versions of Office I had around, and there were rumors that old versions of Office would be blocked.
I didn't want to take the risk, so I started my migration.
> While Windows is often claimed to be backwards compatible, in practice your mileage may vary
It was perfectly backwards compatible: Windows worked fine with very old versions of everything, until some versions of Windows 11 started playing tricks (even with a Pro license).
I really loved Windows (and AutoHotKey and many other things), but now I'm happy with Linux.
> I really loved Windows (and AutoHotKey and many other things)
Oh, do you know how I can configure e.g. Win+1, Win+2, etc. to switch to the corresponding virtual desktops? And how to disable that slow animation and just switch instantly?
Maybe you have some ideas where I should search.
I've used Linux as my OS for a long time, but now I need to use Windows at my job, so I'm trying to bring my Windows experience as close as possible to what is familiar and common on Linux.
> So, I'm trying to bring my Windows experience as close as possible to what is familiar and common on Linux.
I see you were given an answer for the slow animation. For most UI tweaks, regedit is a good starting point.
You may also like PowerToys, but I suggest you take the time to create AHK scripts, for example if you want to make your workflow keyboard-centric.
> So, I'm trying to bring my Windows experience as close as possible to what is familiar and common on Linux.
I did the opposite with the help of hyprland on arch, but it took me years to get close to how efficient I was on Windows, where there are many very polished tools to do absolutely anything you can think of.
There's no built-in way to set hotkeys to switch to a specific desktop. And my primary annoyance is that there's no way to set hotkeys to move a given window to a different desktop.
Well, there's always LD_PRELOAD and LD_LIBRARY_PATH on Linux. My experience has been that most of the time when older binaries fail to run, it's because they are linked against old versions of libraries, and when I obtain those library versions -- exactly the same as obtaining the DLLs for the Windows executable -- things usually work just fine.
You don't need to bundle anything from the system layer with Windows programs distributed as binaries. On Linux there is no proper separation between system libraries and optional libraries; everything could be either, and there are no API/ABI guarantees. So "just bundle your dependencies" simply doesn't work: you cannot bundle Mesa, libwayland, or GTK, but you cannot fully depend on them not breaking compatibility either.
On the Windows side, nobody bundles the Windows GUI libraries, OpenGL drivers, or sound libraries. On the Linux side, the system libs have to be somewhere in the container and you have to hope that they are still compatible.
You cannot link everything statically either. Starting with glibc, there are many libraries that don't work fully, or at all, when statically linked.
I am sure this is true. But I seem to have had good results building static executables and libraries for C/C++ with cmake (which presumably passes -static to clang/gcc). golang also seems to be able to create static executables for my use cases.
Unless static linking/relinking is extremely costly, it seems unnecessary to use shared libraries in a top-level docker image (for example), since you have to rebuild the image anyway if anything changes.
Of course if you have a static executable, then you might be able to simplify - or avoid - things like docker images or various kinds of complicated application packaging.
> I am sure this is true. But I seem to have had good results building static executables and libraries for C/C++ with cmake (which presumably passes -static to clang/gcc). golang also seems to be able to create static executables for my use cases.
Depends on what you link with and what those applications do; I would also check the end result. Go on top of a Docker container is the best case as far as compatibility goes. Docker means you don't need to depend on the base distro, and Go skips libc and provides its own network stack; it even parses resolv.conf and runs its own DNS client. At this point, if you replaced the Linux kernel with FreeBSD, you would lose almost nothing in functionality. So it is a terrible comparison for an end-user app.
If you compile all GUI apps statically, you'll end up with a monstrous distro that takes hundreds of gigabytes of disk space. I say that as someone who uses Rust to ship binaries, and my team already had to use quite a few nasty hacks that walk the ABI-incompatibility edge of rustc to reduce binary size. It is doable, but would you like to wait hours for every single update?
Skipping that hypothetical case, the reality is that for games and other end user applications binary compatibility is an important matter for Linux (or any singular distro even) to be a viable platform where people can distribute closed-source programs confidently. Otherwise it is a ticking time-bomb. It explodes regularly too: https://steamcommunity.com/app/1129310/discussions/0/6041473...
The incentives to create a reliable binary ecosystem on Linux are not there. In fact, I think the Linux ecosystem creates the perfect environment for the opposite:
- The main economic incentive comes from server providers and some embedded systems. Both of those cases build everything from source and/or rely on a limited set of virtualized hardware.
- The cultural incentive is not there, since many core system developers believe that binary-only software doesn't belong on Linux.
- The technical incentives are not there, since a Linux desktop system is composed of independent libraries, written by semi-independent developers, each targeting whatever versions of the other libraries were released in the same narrow slice of time.
Nobody makes Qt3 or GTK2 apps anymore, nor are they supported. On the Windows side, Rufus, Notepad++, etc. are all written against the most basic Win32 functions, and they get access to the latest features of Windows without requiring huge rewrites. It would be cursed, but you can still make an app that uses Win32, WPF, and WinUI in the same program: three UI libraries from three decades, and you don't need to bundle any of them with the app. At most you ask the user to install the latest .NET.
Except that the "you" is different in each case. You're offering options for the distributor, while the quote is talking about options for the user, who has to deal with whatever the distributor landed on. From the point of view of the user at the point of need, the fact that a distributor had choices that could have made their life easier, if only they'd been picked some time in the past, is completely useless.
I think it's not quite that simple, though. For one, the OpenGL driver situation is complex: as I understand it, you need userland per-hardware libraries, which basically require dynamic linking. From that perspective, Windows binaries are de facto the most stable way of releasing games on Linux.
I'm not sure about linux syscall ABI stability either, or maybe other things that live in the kernel?
> I think the opengl driver situation is complex, where I hear you need userland per-hardware libraries which basically require dynamic linking
Yes, the OpenGL driver is loaded dynamically, but...
Are you sure there are any problems with OpenGL ABI stability?
I have never heard of breaking changes in it.
The OpenGL ABI is extremely stable, but OpenGL drivers (especially the open-source ones) also use other libraries, which distros like to link dynamically. This can cause problems if you ship different versions of the same libraries with your program. This includes statically linked libraries, if you did not build them correctly and your executable still exports the library's symbols. Not insurmountable problems, but still things that inexperienced Linux developers can mess up.
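A small demo of that symbol-export pitfall (all names invented): a symbol from a bundled static archive lands in the executable's dynamic symbol table when you link with `-rdynamic`, where a dlopen'ed driver could resolve it instead of the system copy; GNU ld's `--exclude-libs` hides it.

```shell
cat > bundled.c <<'EOF'
/* stands in for a symbol from a bundled third-party static library */
int bundled_version(void) { return 1; }
EOF
cat > main.c <<'EOF'
int bundled_version(void);
int main(void) { return bundled_version() - 1; }
EOF
gcc -c bundled.c && ar rcs libbundled.a bundled.o

# -rdynamic exports the archive's symbol to anything dlopen'ed later:
gcc -rdynamic main.c libbundled.a -o leaky
nm -D leaky | grep bundled_version

# --exclude-libs gives the archive's symbols hidden visibility instead:
gcc -rdynamic main.c libbundled.a -Wl,--exclude-libs,ALL -o sealed
nm -D sealed | grep bundled_version || echo "symbol hidden"
```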
I was thinking the same thing. I've had loads of issues over the years when I have an archived EXE that gets angry about a missing DLL.
Likewise, as the author states, there's nothing intrinsic to Linux that makes it have binary compatibility issues. If this is a problem you face, and you're considering making a distro that runs EXEs by default through an emulation layer, you are probably much better off just using Alpine or one of the many other musl-based distros.
I need to buckle down and watch a YouTube video on this that gives examples. It obviously comes up in computer engineering all the time, but it's something I've been able to skate by without fully understanding; from time to time I see comments like this one that seem perfectly clear, but I'm sure there's still quite a lot of nuance that I could benefit from learning.
This is like the in-soviet-union joke about shouting "down with the US president" in front of the Kremlin. In this case, I too can run a 20 year old Windows binary on Linux wine.