Only because it can now. I think that dimension is mostly tapped out as well:
When I go to a complex website, much of the software needed to use it gets assembled in real time, on the fly, from multiple different networks.
It still sounds ridiculous: when I want to use some tool, I simply direct my browser to download all the software from a cascade of various networked servers and it gets pasted together in real time and runs in a sandbox. Don't worry, it takes only a few seconds. When I'm done, I simply discard all this effort and destroy the sandbox by closing the page.
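To make it concrete, here's a minimal sketch of the kind of markup that drives it (the hosts are hypothetical stand-ins for whatever CDNs a real site uses):

    <!-- One page assembling its software from several unrelated networks -->
    <link rel="stylesheet" href="https://cdn.example-styles.com/ui/theme.css">
    <script src="https://cdn.example-analytics.com/tracker.js" defer></script>
    <script type="module">
      // The app itself arrives from yet another origin, gets stitched
      // together by the browser, and runs inside its sandbox.
      import { render } from "https://cdn.example-app.com/app/main.js";
      render(document.getElementById("root"));
    </script>

Close the tab and the browser discards the whole assembled program.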
This computer costs a few hundred dollars, fits easily in my pocket and can run all day on a small battery.
It has become so ordinary that almost nobody even contemplates the process; it happens dozens of times a day.
I don't see any room for dramatic future improvements in actual person-hours there either. Even if, say, two generations hence there were some 7G where I could transfer terabytes in milliseconds, how would that change how the software is written? It probably wouldn't.
Probably the only big thing here in the next decade or so will be network costs eventually being seen as "free". One day CDNs, regional servers, load balancing, all of this will be as irrelevant as the wrangling with near and far pointers that programming 16-bit CPUs required to target address spaces larger than their registers could reach. If you're under 40 or so, you probably have to go to Wikipedia to find out what on earth that means. Yes, it'll all be that irrelevant.
I mean, the browser paradigm is already in its second generation, from initially running against the mainframe to being reimplemented on functions as a service. And browsers are getting a little bit smarter about deploying atomic units and caching their dependencies. Remember loading jQuery from a CDN? Oof.
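For the under-40 crowd again: the old pattern was a bare script tag pointed at a shared CDN copy, on the theory that some other site had already warmed your cache (the URL below is the classic jQuery CDN; the exact version is just illustrative). The newer pattern is small ES module units the browser fetches and caches individually:

    <!-- The old way: one global <script>, shared CDN, fingers crossed on caching -->
    <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>

    <!-- The newer way: module units the browser fetches and caches atomically -->
    <script type="module">
      import { widget } from "/assets/widget.mjs";  // cached as its own unit
      widget();
    </script>

(Modern browsers partition their HTTP caches per site anyway, so the shared-CDN-cache theory quietly stopped being true.)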
The only saving grace is that JavaScript is willing to throw itself away every couple of years.