Of course it is. If enough people were truly enraged by it, if some leader were to rile up the mob enough, it could be shut down. Revolts have occurred in other parts of the world, and things are getting sufficiently bad that a sizable enough revolt could shut AI down. All we need is a sufficient number of people who are angry enough at AI.
> a software update could easily cripple its ability to run on your local machine
A software update collaborated on by Microsoft, Apple, and countless volunteer groups managing various other distributions?
The cat really is out of the bag. You could probably make it punishable by death worldwide and some people would still use it secretly.
Once things like this run on consumer hardware, I think it's already too late to pull them down fully. You could regulate it, though, and probably have a better chance of limiting the damage; I'm not sure an outright ban would even have the effect you want.
Models released today are already useful for a bunch of stuff. Maybe over the course of 100 years they could come to be considered "out of date", but they don't exactly bitrot just because they sit on a disk, so I'm not sure why they'd suddenly "expire" or whatever you're trying to hint at.
And even over the course of 100 years, people will continue doing machine learning science, regardless of whether it's legal or not; the potential benefits (for a select few) seem too good for people to ignore, which is why the current bubble is happening in the first place.
I think you under-estimate how difficult it is to get "most of the world" to agree to anything, and under-estimate how far people are willing to go to make something survive even when lots of people want that thing to die.
> I think you under-estimate how difficult it is to get "most of the world" to agree to anything
agreement isn't needed
its success sows the seeds of its own destruction: if it starts eating the middle class, politicians in each and every country who want to remain electable will move towards this position independently of each other
> and under-estimate how far people are willing to go to make anything survive even when lots of people want that thing to die.
the funding structure is such that all you need to do is cut off the funding from big tech
the nerd in their basement with their 2023 macbook is irrelevant
Plenty of past civilizations have thought they were invulnerable. In fact, most entities with power think that they can never be taken down. But countless empires in the past have fallen, and countless powerful people have lost their wealth and power overnight.
Rather, it’s many different types of software running on many different systems around the world, each funded by a different party with its own motives. This is no movie…
True, but the system only exists because it is currently economically viable. A mass taboo against AI would change that. And many people outside of tech already dislike AI a lot, so it's not inconceivable that this dislike could be fanned into a worldwide taboo.
> True, but the system only exists because it is currently economically viable.
The "system" isn't a single thing; it's more like a bunch of running apps, some on servers, others on consumer hardware. And the parts that run on consumer hardware will still be around even if 99% of the current hyped-up ecosystem dies overnight; people won't suddenly stop trying to run these things locally.
I get the general "too many variables" argument, but the idea that humans have no means of stopping any of these apps/systems/algorithms/etc. if they get "out of control" (a farce in itself, as it's a chat bot) is ridiculous.
It's very interesting to see how badly people want to be living in, and be active participants in, a sci-fi flick. I think that's far more concerning than the AI itself.
Hmm, good point. Also, when COVID struck, although it took some time, everyone collectively participated in staying home (more or less; I know some people didn't, but participation was vast). We can do the same if we choose.
Eh, it's exactly the Johnny Depp movie that would simplify this into "just flip the power switch".
LLM code already runs on millions of servers and other devices, across thousands of racks, hundreds of data centers, distributed across the globe under dozens of different governments, etc. The open source models are globally distributed and impossible to delete. The underlying math is public domain for anyone to read.
Sure, but those millions of servers and devices are not directly connected (nor can they be connected by the AI). The plot in the movie I shared necessitated the AI being able to turn any computer into extra compute for itself, which is exactly what a "we can never shut it down" scenario would require.
The power switch is still king, even if it's millions of power switches versus one.