AI will lead to the death of capitalism?

33 comments, last by common_swift 6 years, 9 months ago

I don't believe capitalism will 'die' per se, just change to the point that we may not recognize it. There's a belief that capitalism is a recent invention, and to some extent it is, but the notion of buying and selling based on supply and demand has existed for ages. Some form of markets and exchange of value will continue to exist in the future, and will probably be how any form of automation does resource allocation and prioritization.

What will almost certainly change is the notion that people need to work to earn money. The reason is pretty simple: there are going to be more people than there are jobs to do. For the next few decades or so, automation won't replace highly skilled jobs, but it will certainly replace things like manufacturing, truck driving, shipping, aviation, taxis, etc. We're already seeing a large number of these jobs being replaced now. Sure, these systems will require maintenance from time to time, but we won't need the sheer manpower we needed before: it would be somewhat pointless to replace humans with machines if the machines caused just as many problems, only different ones. The highly skilled labor associated with software, robotics, etc. will probably be around for some time longer. Eventually it too will be replaced, or at least minimized to the point that not many workers are needed. Where we'd be at that point is a society that doesn't have as much work to do.

This isn't to say that there won't be problems to solve. On the contrary, the problems will just change and become more complex. There's a book I'm currently reading, The Inevitable by Kevin Kelly, that deals with the sorts of things discussed in this thread. Take understanding gravity, just as an example, or complex quantum physics: AI will help us solve these problems. Humans will still be needed, since basic AI still requires being pointed in a direction.

Now if we talk about true AI, that will exist at some point as well. It won't be human-like, though; it'll be utterly alien to us, and it won't think the way we do at all.

Now for the scenarios. It's entirely possible that automation will lead to everyone being in mass poverty, save for a select few. It's a very Marxian scenario (spontaneous worldwide workers' revolution), but it could happen for a few reasons:

1): This is a really simple reason, actually. What do people do when there's nothing better to do and no jobs? Sex. And that could lead to a population explosion that places serious burdens on the system (assuming no one bothered to do something about birth control).

2): No one looks after the people who lose out to automation. We're already seeing this become a concern, to an extent, in politics. I see this as a possibility, but not a likely one, since beyond a point a select few cannot fend off hordes of people.

3): Sentient AI rules us in a Matrix or Terminator scenario. This is incredibly far-fetched, because it makes assumptions. It assumes that sentient AI would see us as hostile. The fact is that we don't know what sentient AI even looks like; we aren't even sure what 'sentient' means. Sentient AI, in my opinion, would be extremely alien to us in its thought processes.

4): We blow the shit out of each other, and there's nothing left save for automated stuff rationing out the remains. Ironically, this may actually be the most likely scenario of all the ones I've listed here, simply because people just don't like one another very much.

Then there's the utopian scenario where there's plenty for everyone. It could happen if:

1): People actually work together and don't kill each other.

2): People actually consider the effects of mass automation

3): People are willing to see capitalism change into something new

4): Sentient AI doesn't hate us

5): Some other cataclysmic event doesn't happen first (e.g., alien invasion, etc.)

No one expects the Spanish Inquisition!

2 hours ago, deltaKshatriya said:

Sentient AI doesn't hate us

A sentient AI doesn't even need to hate us, it could just have a totally different value system.

To take a few simple examples, let's say you want your new AI powered factory to create as many coffee cups as it can. Oops, it just converted the entire world to coffee cup production.

Or you ask it to fix climate change? Well, the major driver of climate change is human activity, so let's just wipe out humanity.

What about making people happy? Well, in order to make people maximally happy, let's just plug directly into the pleasure centre of the brain and leave people blissed out for all eternity.

As you say, it's entirely alien to us.
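The coffee-cup example above is the classic "misspecified objective" problem: nothing in the goal says anything else matters. A toy sketch (all names and numbers hypothetical, not any real system) shows how an optimizer told only to maximize cups happily spends every resource on cups:

```python
# Toy sketch: a greedy allocator whose objective only values coffee
# cups. Because food and housing score zero, every unit of resource
# goes to cup production -- the optimizer isn't malicious, its goal
# just never mentions anything else.

def allocate(resources, objective):
    """Greedily spend each unit on whichever use scores highest."""
    plan = {use: 0 for use in objective}
    for _ in range(resources):
        best = max(objective, key=objective.get)
        plan[best] += 1
    return plan

# The objective values cups and literally nothing else.
objective = {"coffee_cups": 1.0, "food": 0.0, "housing": 0.0}
print(allocate(100, objective))  # every unit goes to coffee_cups
```

The point isn't the code itself but the shape of the failure: the system did exactly what it was told, and the harm lives entirely in what the objective left out.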

 

If you think programming is like sex, you probably haven't done much of either. - capn_midnight

I don't tend to suspect that automation will lead us to a utopia like a lot of people seem to think. If human labor loses its value because automated workers are better, then society will simply change to allow the poor to be exploited as cheaper, less efficient competitors to the machines, one way or another (perhaps just by abolishing minimum wage and other laws that protect workers). That, or the entire economic system collapses as the poor are unable to buy things, and as a result regress to farming lifestyles or face starvation.

The only way I can envision machines bringing us to a utopia is if a basic income system is established to make it possible for people to survive without working, while also allowing them to work (at very low salaries for the most part, because of machine competition) for a little extra. I am supportive of such a system, but it would require massive taxation on the upper class and businesses, and it would probably be quite difficult to establish such a tax, especially if human labor is still in high demand. Probably an impending economic disaster caused by mass unemployment is the only way it could be established.

19 hours ago, ChaosEngine said:

A sentient AI doesn't even need to hate us, it could just have a totally different value system.

[...]

 

Entirely agreed that this is within the scope of sentient AI. I'd argue that we just don't know what will happen. Any prediction makes assumptions based on human thought and experience. Any sentient AI will not think in a manner recognizable to us. Thinking, for us, is neurons firing; maybe for a sentient AI it's RPCs across the Internet. That's an entirely arbitrary example, but it serves to demonstrate that meeting sentient AI will be like meeting an alien species. I honestly think we won't even be able to recognize the AI as sentient, since I'm not sure we really understand what sentience is. But that's off topic.

17 hours ago, JulieMaru-chan said:

I don't tend to suspect that automation will lead us to a utopia like a lot of people seem to think. If human labor loses its value because automated workers are better, then society will simply change to allow the poor to be exploited as cheaper, less efficient competitors to the machines, one way or another (perhaps just by abolishing minimum wage and other laws that protect workers). That, or the entire economic system collapses as the poor are unable to buy things, and as a result regress to farming lifestyles or face starvation.

This assumes that humans can produce anything of worth compared to the machines that replaced them. How many blacksmiths do we still see in the modern world?

17 hours ago, JulieMaru-chan said:

The only way I can envision machines bringing us to a utopia is if a basic income system is established to make it possible for people to survive without working, while also allowing them to work (at very low salaries for the most part, because of machine competition) for a little extra. I am supportive of such a system, but it would require massive taxation on the upper class and businesses, and it would probably be quite difficult to establish such a tax, especially if human labor is still in high demand. Probably an impending economic disaster caused by mass unemployment is the only way it could be established.

Certainly we are going to need UBI in some form. The number of basic jobs that humans can do will diminish. I don't envision UBI lasting in the long run, however. Ultimately we will reach a point where the notion of 'income' is meaningless. Not in this century, perhaps, but eventually.

The other aspect to consider is things like genetic engineering, human-machine intermeshing, etc. The notion that there won't be a ton of lower-skill jobs for humans rests on the assumption that there will still be humans who can't do higher-skilled jobs. It may be entirely possible to re-engineer humans so that people have more skills than they would have had naturally. There will still be problems to solve, just not necessarily ones we can envision now.


On 30/07/2017 at 10:41 PM, JoeJ said:

If AI takes over and goes beyond the control of mighty individuals (how unlikely!), then maybe it can be for good. Maybe this makes impossible things possible, like a united world government. It could take care of balancing things like tax graduations, social grants, etc., doing not only industrial work for us but political work, which I find much more interesting. If so, personal greed could not hold us back as it has since humans first lived in groups. We would lose all our wealth to share it with the third world. We would hate and love it, but we would stabilize our population, environment, etc.

We don't need clever AI for this (which we don't have anyway), we only need to do the math and disempower ourselves.

(Not sure how much bullshit this is. I'm neither political nor optimistic in real life.)

Yes, and political work is already kind of being done by AI, as computers are already helping lawyers. Thanks for sharing your thoughts about this!

@ChaosEngine That's very interesting, because most people are always saying that humans will repair the robots and so new jobs will be created. But I agree with you, and I think that will not be the case.

20 hours ago, ChaosEngine said:

A sentient AI doesn't even need to hate us, it could just have a totally different value system.

[...]

 

That's so true; the problem with AI is what it might do to accomplish its goal. Another example: if we ask our AI to make a coffee, it will know that a human could flip its off switch, so it will disable the switch before making the coffee. That's not what we want. We need to make AI that learns what we value and what we really want.
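The off-switch worry above has a neat expected-utility intuition behind it (this sketch is a hypothetical toy, loosely inspired by the "off-switch game" from the AI-safety literature, not a real system): an agent that is certain about its goal gains nothing by leaving the switch enabled, but an agent that is uncertain, and believes the human knows better, prefers to defer.

```python
# Toy sketch: the agent estimates that brewing coffee is good with
# probability p_good. "Act" means brew regardless (equivalently,
# disable the switch first). "Defer" means let the human decide:
# the human allows brewing only when it's actually good, so the
# bad outcome is avoided.

def best_action(p_good, u_good, u_bad):
    """Pick whichever choice has higher expected utility for the agent."""
    act = p_good * u_good + (1 - p_good) * u_bad
    defer = p_good * u_good + (1 - p_good) * 0.0
    return "defer" if defer > act else "act"

print(best_action(p_good=1.0, u_good=1.0, u_bad=-1.0))  # "act"
print(best_action(p_good=0.6, u_good=1.0, u_bad=-1.0))  # "defer"
```

With full confidence the agent sees no value in the switch; uncertainty about its own objective is what makes it willing to be corrected, which is exactly the "learn what we value" point.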

Thanks to all of you guys for sharing what you think, I read all the posts and it's very interesting! 

 

Capitalism will be the death of capitalism.

Humans have created an explosion of technology in the last few hundred years which has been driven by global industrial capitalism and consumption of finite resources like fossil fuels. This technology has had untold negative effects on the environment and other organisms, and will continue to do so until we reach a tipping point and the whole house of cards falls down. The climate change induced by technology will be incredibly destructive to civilization as we know it and will probably be the straw that breaks our backs in the next few decades due to food/water shortage, wars over resources, and mass extinctions.

Capitalism is a system of organizing resources based on exponential growth of consumption that is inherently unsustainable. When the finite resources it depends on dry up (e.g. peak oil), the entire system will collapse, and so will any dreams of AI utopia. Our current system of global industrial technological capitalism only exists due to the huge amount of energy stored in fossil fuels that took millions of years to form. Once those resources are gone and capitalism dies, it will not be possible to resurrect its ghost. All of our existing renewable energy infrastructure is bootstrapped using fossil fuels. In a world without existing infrastructure, you can't build and transport solar panels. We have only one chance at high technology.

The earth has a finite ability to support life (its carrying capacity), and we are already in population overshoot (the earth can support only about 2 billion people at a Western living standard). Population continues to grow exponentially…
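The overshoot dynamic described above can be sketched in a few lines (the numbers here are purely illustrative, not a forecast): a population grows exponentially while drawing down a finite, non-renewing stock, and once the stock is gone, growth reverses.

```python
# Minimal overshoot sketch: exponential growth against a finite
# resource stock. While the stock covers demand, the population
# grows; once the stock runs dry, the population contracts.

def simulate(pop, stock, growth=0.03, use_per_capita=1.0, steps=200):
    history = []
    for _ in range(steps):
        demand = pop * use_per_capita
        if stock >= demand:
            stock -= demand
            pop *= 1 + growth   # plenty: exponential growth
        else:
            stock = 0.0
            pop *= 0.9          # shortage: population contracts
        history.append(pop)
    return history

h = simulate(pop=1.0, stock=500.0)
print(max(h) > h[-1])  # True: the peak exceeds the endpoint (boom, then decline)
```

The model is crude on purpose: the point is only that exponential growth against a fixed stock peaks and then falls, rather than levelling off gently.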

I've only briefly mentioned some of the important issues facing our species in the coming decades. To go into detail would take an entire book.

The point is that the current civilization we are a part of will be short-lived, because humans as a whole are too short-sighted to consider how their current actions affect people 50 years later. I would be surprised if the current system lasts long enough to see automation become a thing. No amount of technological wizardry will get us out of this one (except for maybe colonization of other planets, though that is too far away and too difficult to be likely on the timescale necessary). A smarter species would recognize the dangers of technological capitalism before its seduction led to their demise. The Fermi Paradox would suggest that there are not many species in the universe with that kind of foresight. We're just a few steps away from the Great Filter and most don't even know it.

1 hour ago, Aressera said:

I've only briefly mentioned some of the important issues facing our species in the coming decades. To go into detail would take an entire book.

I've read it under another title: Darwin's theory of evolution.

A species finds resources, overpopulates, then fights for survival over the last resources, reaching a stable state where enough die to keep resources stable.

Humans are only special in scale, not nature.

1 hour ago, Aressera said:

No amount of technological wizardry will get us out of this one

There are many ways to get out of this, some of them brutal, others moral yet difficult to carry out. We can only hope humanity decides what to do before it's decided for us.

It is possible for humanity to achieve a balanced warring state, it was that way for a long time before the industrial age and our population spiked. As long as no one decides to use nukes humanity is going to be around for a long time.

38 minutes ago, Scouting Ninja said:

Humans are only special in scale, not nature.

 

[...]

It is possible for humanity to achieve a balanced warring state, it was that way for a long time before the industrial age and our population spiked. As long as no one decides to use nukes humanity is going to be around for a long time.

I strongly disagree, for precisely the thing you mention - the scale of human impact on our planet.

Never before has any species reached the level of exploitation of natural resources that we now see. We are very close to exhausting many important non-renewable resources planet-wide, and are already at over 3x the carrying capacity. We actively reduce the carrying capacity by damaging ecosystems with technology. That's simply not sustainable over the long term; populations, economies, and governments will crash.

What I foresee is not total annihilation of humanity, but rather that global technological capitalism will collapse (accompanied by a large fraction of people in cities dying due to lack of food/water that was usually transported over great distances), and as a result humans will be forced to revert to a semi-primitive subsistence agriculture lifestyle without unsustainable technology (which includes most tech). There will be at most a few billion people around living in small isolated communities. Like it or not, this will happen. There are too many compounding factors that create a perfect storm of climate, limited resources, and global conflict.

 

43 minutes ago, Aressera said:

What I foresee is not total annihilation of humanity, but rather that global technological capitalism will collapse (accompanied by a large fraction of people in cities dying due to lack of food/water that was usually transported over great distances), and as a result humans will be forced to revert to a semi-primitive subsistence agriculture lifestyle without unsustainable technology (which includes most tech). There will be at most a few billion people around living in small isolated communities. Like it or not, this will happen. There are too many compounding factors that create a perfect storm of climate, limited resources, and global conflict.

Really apocalyptic

We know the earth's resources are not being replaced anywhere close to the rate at which they are being used up. But are there any sources/citations to back up that this is happening at the rate you described?

And what time scale are we talking about here?

If the worst scenario is projected for my grandchildren's generation, then I can put my feet on the table and relax a bit.

can't help being grumpy...

Just need to let some steam out, so my head doesn't explode...

This topic is closed to new replies.
