What will humans do if technology solves everything? (economist.com)
33 points by helsinkiandrew on April 14, 2024 | 92 comments


Iain M. Banks created a pretty fascinating world that tries to answer this question in his Culture series [0]. In the books, the AIs let humans do pretty much whatever they want, since it doesn't matter to them. Whether a human wants to pilot a spaceship to explore the galaxy, or genetically modify themselves to live underwater as a sea cucumber for a while, they are given the resources they need. Of course, since post-scarcity in this interpretation means no dire conflicts, that alone would make for a boring book, so most of the story plays out at the edges of the Culture and its interplay with other civilizations. The books are still full of fascinating insights imho.

[0] https://en.m.wikipedia.org/wiki/Culture_series


Didn't the AI have a bit of an ulterior motive in that series, in that they needed ever-changing, ever-renewing humans to keep them from retreating into their own thoughts and going insane? It seemed a lot more like all but a couple of handfuls of humans who could still do something better than the AIs were basically pets with a really nice pet ownership/care agreement...


The problem with envisaging how current technological trends will shape society in the future is that we try to work it out rationally, when people mostly take what they're given without thinking about it too much. The ability to listen to and watch recorded artistic performances - music, plays - ought to have decreased the value of live performances, but tickets to those performances are now more expensive than ever.

The comment on AI parenting isn't as revolutionary as it sounds: wealthy people raising their own children (rather than employing a dedicated nanny) is a 20th-century innovation, at least in the West. But I think people will feel unsettled about their children being raised by robots, regardless of the quality of the tech - Norland College has little to worry about, I think.

But the central issue of this article is the issue I have with Universal Basic Income. If a person's labour is worthless, then the State has to provide some sort of income for them by taxing the business for which they would otherwise have worked. If this becomes ubiquitous throughout the economy, then the State becomes a dominating force in the economy, and the principles of the market economy start to break down. The idea that nothing can be produced without labour is a fundamental assumption of all economic models, and without it we're in very dangerous territory.


> then the State becomes a dominating force in the economy, and the principles of the market economy start to break down

But isn't that the second element of this? The principles of the market economy break down because most people won't have anything to offer in the market, not because the government is over-reaching by trying to provide support for people who no longer have any legal means of supporting themselves in a market where their value is now zero.

We're going to be in very dangerous territory, but it's not because of government meddling (which is how I read your comment as viewing UBI - sorry if this isn't the case, but there are quite a few free market opinions on here, so again sorry if I'm misrepresenting your views). It's because the very thing that all economic models are based on is about to have the rug pulled from underneath it. The models will need to be altered as production without labour is about to become much more common.


Live performance from a given artist doesn't scale. There are more people who want to watch Taylor Swift than ever wanted to watch Elvis Presley, because there are more people, and there is more opportunity to travel to the performance. Even if a modern top-tier artist attracts a smaller percentage of them (because people are happy watching on Netflix or whatever), the price will still increase because the total number has increased.

If there are 1M in the catchment area, 50% want to see them, and there is seating for 50k, then prices will increase to what's affordable for the top 10%.

If there are 3M in the catchment area, even if just 33% want to see them, and seating is still 50k, then prices will increase to what's affordable for the top 5%.
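
A rough sketch of that arithmetic in Python (the helper function and the numbers are illustrative assumptions mirroring the hypothetical figures above, not anything from the article):

    # Illustrative: what fraction of interested fans can actually get a seat.
    # Prices rise until only roughly this top slice can afford a ticket.
    def affordable_fraction(catchment, interest_rate, seats):
        interested = catchment * interest_rate
        return seats / interested

    print(affordable_fraction(1_000_000, 0.50, 50_000))  # 0.10  -> top 10%
    print(affordable_fraction(3_000_000, 0.33, 50_000))  # ~0.05 -> top 5%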

> But the central issue of this article is the issue I have with Universal Basic Income. If a person's labour is worthless, then the State has to provide some sort of income for them by taxing the business for which they would otherwise have worked.

Tax the land. Land value has massively increased over the years, everyone needs at least some land to live, even if it's just 100 square feet. Everyone has a fair claim to the land, unless you're a fan of feudal lords.

Your taxable business will not have a great business if nobody has any income. Doesn't matter how many widgets they make if nobody can afford them.


Reading many of the philosophies on how AI will solve all the problems, I have to wonder if those people have ever worked a "real job". I am a professional in the IT sector just like the majority here, but on the occasions when I do some masonry, plumbing or electrical work, I do appreciate and understand how much flexibility and creativity people in those trades have. Even with "simple software" that you build for a real business (like waste collection or logistics, i.e. not IT), you can see how much more complicated real life is than the "lab" setup AI is evaluated in.

I believe with AI we will see a future similar to self-driving cars: it will improve work and take over some mundane, repeated tasks (the tasks that might take 60-80% of the time), but it won't fully replace the need for workers. The workers will just be different - instead of repeatedly doing stuff, they will have to make decisions, spot issues, etc.

And then there is the whole artisan theme. Take bakeries, for example. One can buy everything cheaper at the supermarket, but there are bakeries where it's just better and preferred by those who can afford it. Or coffee. Or whatever else that is already automated but we prefer to have served by a human.


Even blue-collar workers have imagination, and they know the world is more than just masonry or a trade. They are aware of physics and science, and it rejuvenates them even if they don't actively interact with those fields. The fact that those exist in the background is comforting. It means they share a portion of divinity with all the great people. When humanity's most celebrated jobs are replaced by AI, it says something. It's like the sky turning a brown, poopy color. Sure, it's just a background for most people. But it matters a lot.


Fill out Jira tickets and attend meetings about the Jira tickets.


I do this now. We have meetings about how to create templates on how to write Jira tickets. Meta meta meta work.


Wait until you get to management. I have meetings about who to get to have meetings on how to create templates to create Jira tickets.

And for some reason this is valued and rewarded more than ROI-generating tasks. We are truly fucked as a species.


aaaah fuck I was hoping it was just where I work that this was the loop.


It's down to the age of the business. I have the following hypothesis:

<5 years = sensible

5-10 years = JIRA incursion, agile factions

>10 years = Project manager and bureaucrat hostile incursion

I quite like the latter if you're working from home. You can get a LOT done on shit you actually care about when you're on conference calls just as part of a faction to make it look bigger against another corporate faction.


The Future Is Now.


The problems to be solved keep changing, at the very least because of the solutions to our previous problems. Even "not having anything to do" is a problem. And "everything" is an absolute that may be meaningless in the real world. The landscape will keep changing, and our requirements will too; a "solved problem" of the past doesn't necessarily stop needing humans (coffee machines are a reality, and still a lot of people work at coffee shops, to give a naive example).

But I give this article points for bringing up Greg Egan and Permutation City.


I imagine a lot of humans will be left to die, simply because they are no longer needed. See: those with insecure housing.

We might invent jobs for each other, but their earning potential will not be very high if the labour can be readily done by a machine.

Those who can live with their parents, or have joint ownership with packs of friends, could probably scrape by paying the mortgage on gig work.

Of course, if housing suddenly becomes plentiful and cheap, renters might enjoy this gig economy too.


> I imagine a lot of humans will be left to die

Which would mean discontent and rebellion. I don't think any significant portion of the population would just die quietly. And from what you described, you are probably talking about 50-90% of the population.

Of course if you are cynical, you could just say that technology could "solve" this problem as well.


Imagine a world with 1% of the current population. There would still be millions of humans - the right amount? A world given back to the wild.

Each person supported by a near-limitless amount of fusion energy and robot assistance, 500 years into the future, a long time after the cleansing of the undesirables. Sort of like what happened to the Native Americans, or the natives after any conquest, really.


It probably will. Technological leaps have often led to new kinds of war which reshape the world order (e.g. industrial revolution -> WW1).

Meanwhile, the efficiency with which capitalism (and war) created an angry, resentful underclass sowed the seeds that allowed fascism to bloom in the run up to WW2.


If humans die then what exactly is technology “solving”?


In all brutal honesty it solves a cost center for companies.

Less people to employ, for the same or greater output equals more profits.

I am not a proponent of this development, just a realist.


> Less people to employ, for the same or greater output equals more profits.

Companies are under no obligation to employ people just because they exist, so they can already employ less people if they choose to. I don't see why anyone has to die.


I wasn't saying that corporations are required to hire and then keep their hires on staff. I was talking about the societal impact of the possibility that corporations would lay off large chunks of their employees, during a climate where no work was to be had due to technology taking over most of those jobs.

Being fired and being unable to find employment elsewhere often has serious financial, physical and mental consequences for people. Oftentimes it can completely ruin a person's life.

Not everyone has savings, or even the possibility of creating savings, making the impact of being made redundant that much harder to handle.

Severe life-changing situations can often lead to drug and substance abuse, which carries a higher chance of death. These situations can often lead directly to suicide.

Not every society around the world has systems in place to capture those that fall down, and even those that do are going to have a difficult time handling the large influx of people needing assistance if the techno dystopian future in the article comes to pass.

I would recommend checking out the excellent book "The Body Economic: Why Austerity Kills" by David Stuckler & Sanjay Basu on the societal impact of austerity.


I agree with all of that. But unless I'm misunderstanding, your previous comment was saying that more dead people leads to more profits for companies, and I don't agree with that.

> Being fired and being unable to find employment elsewhere often has serious financial, physical and mental consequences for people. Oftentimes it can completely ruin a person's life.

Yeah, and that's because we've tied a person's survival needs to their ability to find "work". And in many cases "work" is not really a productive activity; it's just a facade we've all bought into. It's a useful facade though, because if we were all perfectly efficient and only rewarded true productive work then we'd be in deeper trouble.

So our challenge, it seems, is to somehow keep finding new "work" when technology makes it harder and harder for us to trick ourselves into believing we're working. "Prompt engineer" comes to mind.


> And in many cases "work" is not really a productive activity; it's just a facade we've all bought into.

https://en.wikipedia.org/wiki/Bullshit_Jobs


GDP

Seems like a weird metric to overfit to. Maybe because it's easily measurable.


It's because ultimately we live in a plutocracy and that is a measure of their collective success.


Share holder value of course.


Humans.


But this is exactly my point. We can already solve the problem of humans and we aren’t bound by technological constraints. If AI is there chugging along after all humans are rendered “useless” then is it much different than the pre-human era where humans weren’t there to do whatever it is we are doing?

The whole thought experiment of “what if AI makes humans obsolete” is pointless because it assumes there is some universal goal that transcends human existence which there is not. We are not here to do anything but maintain our own existence, and existing with as little effort as possible is actually our end goal. The universe has no other purpose than for us to exist in it.


This article has the typical fallacy of any post-scarcity writing: ownership. It assumes that we reach a point where the cost of producing goods becomes close to zero and we happily share it all.

Interesting premise for some philosophical musings, but terribly unrealistic. In fact, I think the first assumption is easier to achieve than the second.


Perhaps scarcity is the reason why humanity hates to share. But in a post-scarcity world, I imagine later generations used to that level of security (and with no memory of scarcity) might begin to think differently. I do agree that the first generation that experiences post-scarcity won't change its habits a whit.


Well, we already have generations of ultra wealthy who essentially have lived post scarcity life. They don't seem so eager to share their wealth, quite the opposite even.


Their wealth is scarce in that they can't share it endlessly. No one is ultra-wealthy enough to say, "I'm going to give $1 million to random people so that people don't have to suffer anymore." That's a critical difference.


If it was in human nature to start sharing once you've hoarded enough, we would certainly start seeing it with our billionaire class. Instead, we're seeing the opposite. I think it's even been studied that the more wealth you have, the more selfish you become.

Plus, of course, there's the physical limitation that no one can ever be rich enough to share endlessly.


When we talk about post-scarcity, we're talking about an endlessly shareable amount of wealth (not in a technical sense, of course).

Then the super-rich will start sharing their wealth, because if they don't, their competitors will do it first and take all the fame. It is not from the benevolence of the super-rich, the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest.


Too subtle. Most readers probably won't recognize you were quoting Adam Smith there.


> their competitors will do it first and take all the fame.

I don't think I'll be holding my breath waiting for ultra rich to be motivated by fame of this type. In fact, I think it's wishful thinking in its finest form.


> Well, we already have generations of ultra wealthy who essentially have lived post scarcity life.

They live a post-scarcity life, but not everyone does -- or can (probably, still). Once everyone could, the greedy hoarder mentality they (and pretty much all of us) operate under right now might (hopefully?) come to be seen as pathological, not ideal.


If we don't earn money, how do we buy the goods the robots make abundant? What's the price when nobody has any money?

The whole point of money is lubrication under scarcity. If you don't have a job, you don't have money. No lubrication.


A small percentage of people control the vast majority of the money, and thereby control the output of the AIs. Everyone else most likely gets by on a UBI. If everything is truly abundant, they may well even have objectively better quality of life than the average person now, just no real ability to impact the world.


> If everything is truly abundant, they may well even have objectively better quality of life than the average person now, just no real ability to impact the world.

So no difference from current conditions, then, except the objectively better quality of life.


UBI is not inevitable. The capital class may well enjoy a world without the 99%.


I have never met a capitalist who wants such a world. If there is such a capitalist, it must be a friend of the infinite paperclip machine.


TBF, from what little we get to know of them via the media, many (most?) of them seem exactly the type for that.


be happy. travel, see the world, meet new people. have sex, raise kids, make human-made music, art, knickknacks. the economist would focus on joblessness but there's more to life than work. the singularity is in sight, but there are multitudes of problems to wrestle with before we can actualize that for everybody.

thing is, that lifestyle has been here for some ever since trust funds have existed. all the singularity will do is let the rest of us live that lifestyle.


Solves everything for everyone, or solves everything for those who have the economic means to adopt the solutions? One is a dystopia, the other is a utopia.


What will hunters and gatherers do once farming is invented and humans just sit and watch crops grow for most of the year?

We will find new stuff to keep us busy, no doubt.

It's utterly ridiculous to think we are at the end of the tech tree, just because we haven't seen the future yet.


There seems to be an innate characteristic of technology that if you solve a problem using it, two more problems spring out. Not sure why that is, but it seems life has become ever more complex, and yet at the same time it's not getting better.


> if you solve a problem using it, two more problems spring out.

It's because people tend to forget the problem and its solution when a technology entirely solves it without creating new problems. ATMs, for example, don't create problems and solve a lot of them.


The realm of creative ideas has no boundaries. It cannot be "solved".


This raises the question though: do we want to create for any reason beyond the influx of pleasurable neurochemicals that creating gives us? And if not, will it matter that we're not creating if the machines can just directly cause that influx and short-circuit the whole process?

Even with crude street drugs right now, many people forego fulfillment in life for hedonistic indulgence. Bring that down to the level of direct neural stimulation, with balanced supplementation to avoid burnout, and I seriously doubt that most people would actually miss the activities we think of as fulfilling in themselves.

Now, whether or not it's actually practical to ever get that far is another question -- one more for the bioengineers than for the philosophers.


Is that necessarily true on a very long timescale?

Is it getting easier or harder over time to make meaningful contributions to storytelling, science, mathematics, etc?


The heat death of the mind. It's possible I guess. But not in our lifetimes.


Depends - certain disciplines become tapped out, but if you think about the parameter spaces for something even as simple as music, it’s enormous. Skrillex only seems obvious once it’s part of the distribution of what we’ve experienced.


According to Brave New World, the answer is drugs.


Don't forget the orgies


I still don't quite understand how BNW was a dystopian novel.


The more I learn about the modern food industry, the more I’m convinced we are already there.

I mean, what is a Snickers or a Big Mac? Food? Or a compound of the most rewarding substances (sugar, fat and salt) for the human brain?

(Also smartphones and social networks but I’ll stop there before writing a paper)


How many people do you know who wouldn't care about any of life's problems if they could get a Big Mac?


A lot. Me included. A lot of people eat caloric food to compensate for stress. It's a well-documented behavior, and it's just a different poison from cigarettes or alcohol to get the dopamine they're lacking.

Then it activates the same reward mechanisms in your brain and then you are stuck.

Yes it’s short time relief, but that’s no different than any other drug and this one is legal and cheap.


That is the difference: it's only short-term relief. And you're still caring about the problem now. In fact, it's even possible that you can care about it precisely because you've had a Big Mac to reduce your stress. In the context of this discussion, drugs mean something that prevents you from caring about the problem at all. The Big Mac is just junk food.


> ...working time has fallen greatly. In the rich world average weekly working hours have dropped from more than 60 in the late 19th century to fewer than 40 today.

Weekly working hours per job, sure... But reading about how many Americans seem to need to work two or more of those jobs in parallel, the article's thesis doesn't feel like the whole truth.


Stimulate our pleasure centers. Or rather, lay back as the machines stimulate our pleasure centers, and nourish us, and maintain our muscles and organs with electronic stimulus.

Heck I'm not even talking dirty here -- I'm talking about direct stimulus at the neural level.

Though for as bad as they were as movies, this may get into some of what the Matrix sequels touched on. Humans actually want some degree of adversity.


We can evolve those desires away if we must.


I think that situation will not arise. Because the boundaries between humans and technology will blur.

We are already cyborgs. Sitting on bicycles, using a handheld computer and headphones.

After a few decades or centuries, there might be no organic cells left. We might be made completely from silicon. And look nothing like we do now. But we will continue to do what we always did. Work hard to optimize our survival and reproduction.


One thing that is unlikely to change, regardless of what happens: people will keep writing these speculative doomsday pieces (perhaps with the help of some generative AI).

“What if all the roombas decide to band together and unionise? That would be a disaster!”

When humanity gets wiped out eventually, maybe the robots will keep the tradition going?

“Traces of human DNA discovered in Alaska! Any chance they could come back?”



How do you make these links?

Edit: actually, not working anyway.


I usually go to https://archive.is/, paste in the link I want to archive, submit, wait a minute, and then receive the link to the archived article.


Humans are spectacular at creating their own systems for abusing and depriving themselves. No doubt we'll still be judging and killing each other for something or other. We might even have more time to decide on things we simply cannot stand the idea of someone else doing or being.


If we can automate anything ANNOYING we can do anything we like.

Some would do just science and innovation, because "solve everything" means nothing in practice - it just means solving current problems, as we know how to solve them, in an automated fashion. Some others will live the Virtual Revolution (movie) "connected life", some will have as much sex as their bodies allow, etc.

One day we hope to become able to evolve beyond our bodies, substituting them with machines, to reach formal immortality, meaning "we live as long as we like, no suffering, no illness etc". That's an old dream of humanity; the ancient Greeks called it "the Golden Age".


Technology will solve the human problem as well


Learn and serve others. Develop virtues, and skills, and grow, and become better, fulfilling one's potential.

Just learning math or musical improvisation can fill a lifetime. Etc.

(I thought a great challenge would be the sport of starting with an empty field, just the clothes one is wearing, and access to wikipedia, and see how quickly one can develop technologies from survival up to smart phones, while trading only with others similarly engaged.)

The article suggests these things. But, so much more could be said. I believe our spirits are children of God, with infinite potential, and much to learn...


When technology has solved everything, humans no longer exist. They've been taken care of ;)


It won't. There will always be problems due to conflicts of interest between people.


> What will humans do if technology solves everything?

Immediately die of profound shock and embarrassment, on the discovery that a robotic hand made of cold titanium hyperalloy could be both willing and able to replace toilet paper.


We’ve had water hoses for ages, but the TP manufacturers still dominate ass cleaning. We may be overestimating the speed at which technology can devour culture.


That may rather have been my point. Technology will never solve everything… especially because possibly the sole actually reliable capability of human intellect is to take a problem for which there is a known, obvious, and workable solution and figure out a way to employ technology to make it demonstrably worse but also get someone else to pay for it.


As long as there is a need to make money to survive and live, technology doing more is actually a bad thing. It'll ultimately help the "owners" and crush the "doers".


Turn into technology.

But as others have mentioned, there will be no shortage of stuff to do, even if we end up immortal. That in itself would be a contradiction of the problem statement.


Solving everything means that technology will also solve problems of unemployment, the meaning of life, and all that.


We could just ask the technology for ideas?


Live in VR worlds where they can experience struggle, conflict and suffering if that's their kink.


I bet we will be very, very busy. AI will expose so much value that it will keep us running. Remember that any human will also have AI assistance, so we won't perform below AI level.


Get all stoic and meditate or something.


Become homeless except in FL.


Human wants are limitless.


The signal will be more and more depressive content stressing out young people who don’t know better, so they don’t have kids or worse. Then make healthcare bad - it’s expensive and if governments don’t do it right, nobody blames the government, but doing it wrong saves soo much money. Then put a strain on things humans need but machines don’t - food, mental health, time off, sunlight, recreation, childcare, education. This doesn’t have to be done by machines either - just successful but sociopathic humans with resources and automation that doesn’t question their motive or means. Because you can achieve scale fast through tech, anything damaging can be scaled quickly, so if communication between humans becomes impaired, then they can all get surprised by the same problem and not know how to respond. Finally you may see some specimens kept as pets to aid whatever is left unaided.


this is referred to as a contradiction


What is it contradicting?


I'm guessing, but I think the parent comment was meant to imply that a "solver of all problems" that doesn't solve the problem of "people having nothing to do" is a contradiction.


Oh, I see. Isn’t that more of an oxymoron though?



