> Software technology is in decline despite appearances of progress. While hardware improvements and machine learning create an illusion of advancement, software's fundamental robustness and reliability are deteriorating. Modern software development has become unnecessarily complex, with excessive abstraction layers making simple tasks difficult. This complexity reduces programmer productivity and hinders knowledge transfer between generations. Society has grown to accept buggy, unreliable software as normal. Unless active steps are taken to simplify software systems across all levels, from operating systems to development tools, civilization faces the risk of significant technological regression similar to historical collapses.
I haven't watched that talk by Blow yet so maybe he covers my concern.
I think you have to be mindful of incentive structures and constraints. There's a reason the industry went down the path it did, and if you don't address that directly you're doomed to failure. Consumers want more features, the business demands more stuff to increase its customer base, and the software developers are stuck trying to meet demand.
On one hand you can invent everything yourself and do away with abstractions. Since I'm in the embedded space I know what this looks like. It is very "productive" in the sense that developers are slinging a lot of code, but it isn't maintainable, and eventually it becomes a huge problem. First, no one has enough knowledge to implement everything to the point of it being robust and bug-free; this goes against specialization. How many mechanical engineers design their own custom molding machine in order to make parts? Basically none; they all use mold houses. How many electrical engineers design their own custom PCB(A) processing machines or ICs/components? Again, basically none. It is financially impossible. Only in software do I regularly see this sentiment. Granted, these aren't perfect 1-to-1 analogies, but hopefully they get the idea across.

On the other hand you can go down the route of abstractions, which is really what market forces have incentivized. This also has plenty of issues, which are being discussed here.
One thought that I've had, admittedly not fully considered, is that perhaps F/OSS is acting negatively on software in general. In other engineering disciplines there is a cost associated with what they do: you pay someone to make the molds, the parts from the molds, etc., and it's generally quite expensive. With software, the upfront cost of adopting yet another open source library is zero to the business. That is, there is no effective feedback mechanism of the form "if we adopt X, we must pay $Y." Like I said, I haven't fully thought this through, but if the cost of software is artificially low, the business, and by extension its customers, don't see the true cost and are themselves incentivized to ask for more at an artificially low price, leading to the issues we are currently seeing. Now don't misread me: I love open source software and have nothing but respect for its developers; I've even committed to my fair share of open source projects. As I've learned more about economics, though, I've been trying to view this through the lens of resource allocation, and it has led me to this thought.
Interesting. My experience is that bulky abstraction layers are harder to maintain than software we write ourselves.
In game development, whenever we go with highly abstract middleware, it always ends up limiting what we can do, at what level of performance, and how much we can steer it toward our hardware targets. Moreover, when teams become so lean that they can only do high-level programming and something breaks close to the metal, I've seen even "senior" programmers in the AAA industry flail around with no idea how to debug it, and no skills to understand the low-level code.
I've seen gameplay programmers who don't understand RAII and graphics programmers who don't know how to draw a shape with OpenGL. Those are examples of core discipline knowledge lost in the games industry. In other words, we might no longer know how to build from scratch what we have now; or at least most software engineers in the industry wouldn't. It cannot end well.
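(For anyone who hasn't run into the term: RAII ties a resource's lifetime to an object's scope. Here's a minimal sketch of the idea in C++, wrapping an OpenGL vertex buffer so it covers both of the examples above; it assumes a GL context and a loader such as glad are already set up, and the VertexBuffer name is just illustrative.)

```cpp
#include <glad/glad.h>  // assumption: some GL loader is in use
#include <cstddef>
#include <utility>

// RAII: the constructor acquires the GPU buffer, the destructor releases it.
// Ownership is tied to scope, so there is no manual cleanup call to forget.
class VertexBuffer {
public:
    VertexBuffer(const void* data, std::size_t size) {
        glGenBuffers(1, &id_);
        glBindBuffer(GL_ARRAY_BUFFER, id_);
        glBufferData(GL_ARRAY_BUFFER, static_cast<GLsizeiptr>(size),
                     data, GL_STATIC_DRAW);
    }
    ~VertexBuffer() { glDeleteBuffers(1, &id_); }  // runs when scope ends

    // Non-copyable: two owners of one GPU handle would double-delete.
    VertexBuffer(const VertexBuffer&) = delete;
    VertexBuffer& operator=(const VertexBuffer&) = delete;

    // Movable: ownership transfers; the moved-from object holds 0,
    // which glDeleteBuffers silently ignores.
    VertexBuffer(VertexBuffer&& other) noexcept
        : id_(std::exchange(other.id_, 0)) {}
    VertexBuffer& operator=(VertexBuffer&& other) noexcept {
        if (this != &other) {
            glDeleteBuffers(1, &id_);
            id_ = std::exchange(other.id_, 0);
        }
        return *this;
    }

    void bind() const { glBindBuffer(GL_ARRAY_BUFFER, id_); }

private:
    GLuint id_ = 0;
};
```

This is exactly the kind of pattern I mean by core knowledge: nothing exotic, just scope-based ownership, yet it's what keeps a codebase from leaking GPU handles.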
Building your own, in my experience, is a better idea: then you can at least always steer it, improve it, evolve it, and fix it. And you don't accidentally build companies whose knowledge is a mile wide and an inch deep, which genuinely cannot ship innovative products (it is technically impossible).
I no longer think that's true. Instead, I think consumers want reliability, but more features are a way to justify subscription price segmentation and increases.
Everyone has a different tipping point. But generally I see that folks want more features and don't value reliability unless it's something they use really often and that has no workaround.
I play games with known bugs, and on imperfect hardware, because I'm unwilling to pay more. Some experiences are rare, so I tolerate some jank because there aren't enough competitors.
Operating systems are quite mature. I suppose they do need to evolve to take advantage of newer hardware and new UI conventions; for example, swipe typing and H264 decoding are table stakes on mobile.
Most large enterprise IT departments are fully aware that the cost of adopting yet another open source library is very high even if the price is zero. This cost comes in the form of developer training, dependency management, security breaches, patent troll lawsuits, etc. There is a whole niche industry of tools to help those organizations manage their open source bill of materials.
Adopting libraries and third-party solutions is like jumping into a pool: easy to do. Getting out of the pool is much harder. Or in some cases it's like jumping into quicksand, and sometimes it's hard to tell which before you're in it.