Hacker News

Is it even possible to shut it down?


Of course it is. If enough people were truly enraged by it, if some leader riled up the mob enough, it could be shut down. Revolts have occurred in other parts of the world, and things are getting bad enough that a sizable revolt could shut AI down. All we need is enough people who are sufficiently angry at AI.


But it's just like, math. The math is out there now. You can't shut down math.


Good luck shutting down the LLM running on my MacBook.

The Pandora’s Box is open. It’s over.


a software update could easily cripple its ability to run on your local machine

unless you plan to never update again


> a software update could easily cripple its ability to run on your local machine

A software update collaborated on by Microsoft, Apple, and countless volunteer groups managing various other distributions?

The cat really is out of the bag. You could probably make it punishable by death worldwide and some people would still use it secretly.

Once things like this run on consumer hardware, I think it's already too late to pull it down fully. You could regulate it, though, and probably have a better chance of limiting the damage; I'm not sure an outright ban would even have the effect you want.


the Nvidia/AMD/Apple chips all require proprietary firmware blobs, so a ban could be enforced there

yes, you won't get the people who never update, but you'll get the overwhelming majority

and the hardware the never-updaters use will eventually fail and won't be replaceable

also: ban the release of new "open" models; they will slowly become out of date and useless

combine these, and the problem will solve itself over time


> they will slowly become out of date and useless

Models released today are already useful for a bunch of stuff. Maybe over the course of 100 years they could be considered "out of date", but they don't exactly bit-rot just because they sit on a disk; not sure why they'd suddenly "expire" or whatever you're hinting at.

And even over the course of 100 years, people will continue doing machine-learning science, regardless of whether it's legal; the potential benefits (for a select few) seem too good for people to ignore, which is why the current bubble is happening in the first place.


your hardware won't last 100 years

> And even over the course of 100 year, people will continue the machine learning science

the weak point is big tech: without their massive spending the entire ecosystem will collapse

so that's what we target: politically, legally, technologically, and through regulation

we (humanity) only need to succeed in one of these domains once, then their business model becomes nonviable

once you cut off the snake's head, the body will die

the boosters in search of a quick buck will then move on to the next thing (probably "quantum")


I think you under-estimate how difficult it is to get "most of the world" to agree to anything, and under-estimate how far people are willing to go to make something survive even when lots of people want that thing to die.


> I think you under-estimate how difficult it is to get "most of the world" to agree to anything

agreement isn't needed

its success sows the seeds of its own destruction: if it starts eating the middle class, politicians in every country who want to remain electable will move toward this position independently of each other

> and under-estimate how far people are willing to go to make anything survive even when lots of people want that thing to die.

the structural funding is such that all you need to do is chop off the funding from big tech

the nerd in their basement with their 2023 macbook is irrelevant


Plenty of past civilizations have thought they were invulnerable. In fact, most entities with power think that they can never be taken down. But countless empires in the past have fallen, and countless powerful people have lost their wealth and power overnight.


There's a big difference between a civilization being taken down, and civilization being taken down.


It's just software running on a server... this isn't a Johnny Depp movie [1]. Just flip the power switch on the racks.

[1] https://www.youtube.com/watch?v=0jg3mSf561w


Rather, it’s many different types of software running on many different systems around the world, each funded by a different party with its own motives. This is no movie…


True, but the system only exists because it is currently economically viable. A mass taboo against AI would change that. And many people outside of tech already dislike AI a lot, so it's not inconceivable that this dislike could be fuelled into a worldwide taboo.


> True, but the system only exists because it is currently economically viable.

The "system" isn't a thing, but more like running apps, some run on servers, other consumer hardware. And the parts that run on consumer hardware will be around even if 99% of the current hyped up ecosystem dies overnight, people won't suddenly stop trying to run these things locally.


And every single one has a power switch.

I get the general "too many variables" argument, but the idea that humans have no means of stopping any of these apps/systems/algorithms/etc. if they get "out of control" (a farce in itself, as it's a chatbot) is ridiculous.

It's very interesting to see how badly people want to be active participants in a sci-fi flick. I think that's far more concerning than the AI itself.


Hmm, good point. Also, when COVID struck, although it took some time, everyone collectively stayed home (more or less; I know some people didn't, but participation was vast). We can do the same if we choose.


"Skynet was software; in cyberspace. There was no system core; it could not be shut down"

Yes. Look at how much trouble we have now with distributed denial of service attacks.

Go re-read "Daemon" and "Freedom™", by Daniel Suarez (2006). That AI is dumber than what we have now.


On the other hand, if those fighting Skynet were asked to trade Skynet for the AI we have now, they would take it as their new enemy in a heartbeat.


Eh, it's exactly the Johnny Depp movie that would simplify this into "just flip the power switch".

LLM code already runs on millions of servers and other devices, across thousands of racks, hundreds of data centers, distributed across the globe under dozens of different governments, etc. The open source models are globally distributed and impossible to delete. The underlying math is public domain for anyone to read.


Sure, but those millions of servers and devices are not directly connected (nor can the AI connect them). The plot in the movie I shared required the AI to be able to turn any computer into extra compute for itself, which is exactly what a "we can never shut it down" scenario demands.

The power switch is still king, even if it's millions of power switches versus one.



