As someone who uses mobile Linux, I am pretty excited to see this, but I can't help but wonder if this is only a "business decision" and not necessarily Qualcomm turning over a new leaf to be FOSS friendly:
I'm personally rooting for "business decision" over "turning over a new leaf".
If FOSS support is motivated by a clear profit motive, then it'll be viewed positively by shareholders and stick around no matter who is in charge. If FOSS support comes from "turning over a new leaf", it could be dropped at a moment's notice in response to a leadership change.
IMO we will always see far better FOSS support from the private sector when the time they invest has a positive ROI that is obvious and easy to brag about in a quarterly earnings call.
Incentives trump feelings for publicly traded companies 99 times out of 100. People constantly anthropomorphize them, but they aren't people (regardless of similarities in the law), and they definitely don't act like people, at least normal ones. At best, you can view them as something like a sociopath. I wouldn't look at a sociopath acting nicer and think "oh, they turned over a new leaf" because they aren't just going to change how their mind works, I'd think "oh, they found a reason to act in a way I like for the time being. I hope it isn't short lived."
I like to call them slow-AI. They are paperclip optimizing AIs. No single component wants the larger outcomes, yet they happen. These slow-AIs are terraforming our planet into a less habitable one in order to make GDP number go up, at any cost.
People changed the environment even before these optimizations. I think now it's more a problem of catching up and converging fast enough, for example for CO2: https://ourworldindata.org/grapher/co-emissions-per-capita?c... - if the rich countries reduced a bit faster (using better technologies), then those technologies could be used by the others and the impact would be reduced.
It would be great if we could engineer our way out of this situation, but we can't. For many years I strongly believed in our cleverness; after all, I was clever, and in the narrow domain I worked in - tech - cleverness was enough to overcome most issues. So why not human-caused climate change?
In Tom Murphy's words:
> Energy transition aspirations are similar. The goal is powering modernity, not addressing the sixth mass extinction. Sure, it could mitigate the CO2 threat (to modernity), but why does the fox care when its decline ultimately traces primarily to things like deforestation, habitat fragmentation, agricultural runoff, pollution, pesticides, mining, manufacturing, or in short: modernity. Pursuit of a giant energy infrastructure replacement requires tremendous material extraction—directly driving many of these ills—only to then provide the energetic means to keep doing all these same things that abundant evidence warns is a prescription for termination of the community of life.
> It would be great if we could engineer our way out of this situation, but we can't.
I think it would be much more honest to say we don't know so we shouldn't bet everything on one approach.
Humans care about survival and will impact the world. It is exactly what all other animals do, and there is a dynamic equilibrium: too many predators => reduced prey => fewer predators. I don't think it's fair to think we humans are special. Or should we blame the algae for one of the previous mass extinctions?
I do think it is reasonable to take more care of the environment (CO2, pollution, etc.) than we do, because we need it in order to live well (not because I just want a nice Earth). I think most people agree with that and are slowly adapting. We'll see if it's fast enough.
Our viewpoints don't seem that far apart and thanks for the nuanced take. Personally I believe we know that technology can't fix this by definition because the problem is of social, cultural and economic nature. Our lifestyles are woefully incompatible with a 100k year horizon, even a 100 year horizon in many areas. Our perception of wealth depends on never ending growth, our welfare systems depend on never ending growth, our economies depend on never ending growth. It seems implausible to the point of impossibility that our economies can grow forever [1]. Technology is good at reaching goals e.g. going to the moon is unlikely without science and technology. But in this case the problem is the goal itself. Technology won't motivate us to let go of our conveniences.
Snapdragon does poorly, I think, because buying one is a bet on whether things will work. Windows runs things seamlessly other than OpenGL (it can run that too, but it's not straightforward; you need the OpenGL-to-DirectX compatibility pack from the Store). The other reason is cost: for the premium business laptop, most buyers (businesses) won't budge off Intel because of the "no one got fired for buying IBM" mentality at the big enterprises I've been at.
I will say, with my Snapdragon 8 Gen 3, I'm impressed and also disappointed: the stupid thing needs active cooling, and I'm pretty sure it got hot enough that heat desoldered or damaged the core or something. Also, you can't get driver updates for the GPU even if you wanted to, because Qualcomm be the way it do.
Driver updates depend on your OEM. Both ARM and Qualcomm ship driver updates for their premium and upper high-end SoCs; the support reaching your phone is on the OEM. Google has started to push direct GPU driver updates starting with the Pixel 10, so hopefully others may follow.
Usually GPU vendors (Nvidia, Intel, AMD) provide a way to download and install drivers manually (on Windows), including specific versions or older versions. Qualcomm is an outlier in this case.
Note that those drivers usually only work well on desktops. On laptops, the GPU may have gone through OEM adaptations in the motherboard integration, and a driver straight from the GPU vendor might have issues.
A common example is overheating, because the way the OEM has done their device isn't a setup that the driver knows about.
Which is why, on laptops, the drivers (if available) have to come from the OEM themselves.
It might be more fair to say that there's simply no standard way to do power management for discrete GPUs in a laptop, or to integrate such power management with Windows and whatever power management it's trying to do. And the lack of a clear "right" way to do things means laptop OEMs use this area for product differentiation with their own shitty special sauce software and firmware hacks.
If installing drivers that come directly from NVIDIA onto a laptop can cause that laptop's GPU to overheat in the sense of getting so hot it fails to function properly or has degraded reliability, that's entirely NVIDIA's fault. If by "overheating" you just mean drawing more power and causing the fans to get louder than they would in an out of the box configuration, the blame for that should be shared between NVIDIA, Microsoft, and the laptop OEM, but you shouldn't blame the user for doing something that should work and would work if those three vendors could cooperate.
I've used basically every Windows on ARM machine. I actually quite like my X Elite ThinkPad T14s Gen 6 compared to the X13s; it feels like they got everything right that the X13s got wrong.
Go into a store, get to talk with a "genie", accept whatever upsells meet their KPIs, and leave with another computer that counts toward world market share in desktop units.
Which for all practical purposes means Windows, macOS, Chromebooks, iPad Pro, or Android tablets with a keyboard for DeX, HyperOS, or HarmonyOS NEXT, and zero GNU/Linux devices.
Of course it's a "business decision". Companies don't do things for any other reason. They see a benefit to upstreaming in this instance, and will do it again (or not) depending on whether or not they expect to see benefits in the future.
This is no different from any other company that has "embraced" open source.
It'll probably be as much of a second-class citizen elsewhere (the real problem is that the hardware hasn't been as good as Apple Silicon laptops but has been in the same price class at the bottom), but it's good they chase everywhere rather than just one use case.
In the case of Linux, that issue is solely because of non-upstreamed drivers. With those upstreamed, it can be a first-class citizen just like any other processor.
It's second class on Windows because it doesn't support game DRM and generally performs worse for the price than an x86 laptop. About the only thing it really has going for it is better battery life. Using Linux doesn't really change either of those problems, though it does get you away from the mess that is Windows 11.
First-party native software support is high, and third-party native software support is higher than on Linux. Both have feature-complete userspace emulation layers for the third-party gap (largely game focused on Linux; Windows doesn't need Proton for that). Both can run open source apps natively.
10-year-old laptops are still powerful enough for my usage, so a bit more battery life wouldn't hurt, as long as an ARM system provides at least as much performance.
I am pretty sure 99% of the population is in the same situation.
The situation was that 99% of Windows laptops sold ended up being the much cheaper x86 ones, or similarly priced ones with more performance.
It's like phones, where people claim everyone will snap up any phone with more battery life, and yet the standard type of phone is always what actually sells. Consumers will gladly pick more battery all else equal, but that's not what the Snapdragon laptops have been.
Not that Apple gave consumers much choice, but when they switched over it was truly all else equal or better, and it sold like hot cakes: even though many popular apps still needed emulation for the first year, they still ran better than they would have natively on the alternative. Qualcomm doesn't get that nearly as easily because they are competing against the latest of all current x86 options, not a subset of older Intel options.
I'm hoping that changes with the newer Elites though. At least the performance seems to be getting there, if not the price yet.
I'd imagine it's purely because not doing it turned out to be a PITA in the long term.
As with pretty much all other ARM CPU vendors, they pushed their own kernel fork just to have drivers that didn't need to be okayed for the mainline kernel. It made for faster iteration in delivering something working to their clients, but it was also a PITA for those clients, especially when the industry started demanding longer support for their devices.
Their problem was that they had the performance claims and marketing of Apple but the implementation of Microsoft Teams. Apple M1 was shaky but all the groundwork was there and it took off. Qualcomm was highly questionable at best.
Software-wise: Ubuntu Touch, PostmarketOS, and Mobian are all actively maintained. Ubuntu Touch uses Lomiri as its UI which is somewhat bespoke (though they're working on disentangling it from the distro for packaging elsewhere), the others use various mobile Linux UIs (and there's a surprisingly large variety of options there).
> Their Snapdragon X laptop didn't do very well, and they likely realize an ARM Windows laptop will always be a second class citizen
Why? So far ARM laptops provide either vastly better battery life for the same performance or vastly better performance for the same battery life. Even versus discrete GPUs.
Within a couple of years from now you're going to look like an utter fool for buying x86 (and an Nvidia / AMD / Intel GPU) unless Intel, AMD, and Nvidia really pull their heads out of the sand.
There are a few specific workloads, like local LLMs and legacy software, where you'd want a discrete GPU or x86, but otherwise it is looking like GG.
Well, your article already clearly states that performance tanks as soon as you go on battery, by 20-40%.
On another very reputable Dutch site, you can see the Snapdragon consistently leading the Lunar Lake laptop, and that's with Lunar Lake set to maximum performance[0].
There is also a general logic to it: the Apple M-series still handily beats anything Intel has, and Qualcomm's Snapdragons beat the M-series they follow.
Maybe Intel can truly push x86 to unseen heights, who knows? There's nothing technically stopping them, but so far it hasn't been borne out. Similar with Nvidia: their RTX 3090, power limited at 340W, got beat by an M1 maxed out at 120W. Why isn't the RTX 4090 or 5090 half the TDP?
- Their Snapdragon X laptop didn't do very well, and they likely realize an ARM Windows laptop will always be a second class citizen: https://www.techpowerup.com/329255/snapdragon-x-failed-qualc... .
- Likewise, mobile SoCs are completely dependent on Android without proper upstreaming (which they haven't done in the past).
- They are seeing Valve spending time and money on FOSS support paying off, especially with their new hardware releases.
On the other hand, proper upstreaming of the chips gives them much more flexibility for different Linux-based OSes.