25 Comments

I just turned 35, so my fears of AI and technology have always naturally assumed the shape of various Terminators and the squid-monsters from the Matrix. It's taken some maturity and deep thinking to get past these fantasies.

I agree that the takeover of humanity by generative AI is a technological, material, and energetic impossibility at the current moment. However, the potential of autonomous killing machines is still plenty terrifying. We see how drones and cost-effective missiles are changing the face of the battlefield in Ukraine and Israel. The proliferation of autonomous weapon systems continues across the globe, with Turkey reportedly scoring the first fully autonomous drone kill five years ago. And the cute robot dogs that everyone loved to see dance a few years back now have sniper rifles attached to them and are killing civilians in Gaza.

While being ruled by generative AI is impossible, the kind of artificial intelligence that drives the navigation and target acquisition of humanity's killing machines is quite mature. And our human and corporate overlords seem quite eager to use them. As energy supplies dwindle, I would expect to see generative AI abandoned. However, the genie is out of the bottle already when it comes to autonomous weapon systems, and it seems energy and materials will be diverted to their use as long as civilization exists.

Suppose in a hundred years, only three megacities run by hydropower still exist. I would still expect to see drones patrolling the skies above them to better surveil and control the population.


The robot dogs with rifles were literally a Black Mirror episode that has come all too scarily true, but yes to pretty much all of this.


If the purpose of AI is to improve economic efficiency, then why is it consuming MORE energy? If it increased efficiency, wouldn't energy consumption be going down and not up? What am I missing here?


Because these are two separate energy circuits, for one.

For example, let's suppose AI is used to make a car manufacturing plant more energy efficient: it can, e.g., design the conveyors more effectively and cut the materials so there is less waste, shaving off a few percent here and there.

But the AI system itself is also using greater amounts of energy - greater than it saves in the car plant.

This is not uncommon in compartmentalised systems with different budgets.
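A toy calculation makes the point concrete. All the numbers below are invented for illustration; the only claim is that the two budgets are tracked separately:

```python
# Hypothetical numbers for illustration only.
plant_consumption_gwh = 100.0  # the car plant's annual energy use
efficiency_gain = 0.03         # AI shaves ~3% off the plant's consumption
ai_consumption_gwh = 5.0       # energy used to train and run the AI system

saved = plant_consumption_gwh * efficiency_gain  # savings on the plant's budget
net_change = ai_consumption_gwh - saved          # system-wide change in energy use

print(f"Plant saves {saved:.1f} GWh, but the AI uses {ai_consumption_gwh:.1f} GWh")
print(f"Net system-wide change: +{net_change:.1f} GWh")
```

The plant's budget shows a saving, the data center's budget shows a cost, and nobody in either compartment sees the net increase.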


It sounds like you are just supporting my point. Overall we are getting less efficiency by using AI. AI uses more energy than it saves.


Wait until you listen to Vandana Shiva talk about lab-grown meat. The tech companies trying to create lab meat are selling the idea to investors and vegetarians as "a humane alternative," but they aren't telling the general public that it takes 1000x the energy to create a pound of meat. Add in the additional data center / AI energy consumption mentioned above and... oops.


I would answer that society doesn't have a "purpose," per se—more like a modus operandi. As laid out by Joseph Tainter, the natural response of a society to any problem is a permanent increase in societal complexity, with the assumption that the costs of complexity will be made up by increased efficiency. Societies collapse when additional complexity yields neutral or even negative returns.

So, we are investing in AI because we collectively assume it's going to help us solve more problems than it causes. Bill Gates said last year that AI could assist with global warming and food production, for instance. But instead, it has used up a lot of water that could have been used for agriculture, and it has released tons of carbon into the air through electricity consumption. In addition, AI seems on the verge of causing more social problems than it solves.

This is why many observers such as Indrajit Samarajiva argue that AI and other technologies that are supposed to stave off environmental collapse and overshoot are actually accelerating the process. https://indi.ca/ai-is-a-sign-of-collapse/

Samarajiva is a particularly interesting read because he aligns himself with the Islamic Resistance in the Middle East. If you read the ideological documents of some Islamic Resistance groups, such as The Management of Savagery, you can see that they have weaponized Tainter's ideas against the United States for decades.


I'll take that as agreement with my basic point. And the answer to the question "why?" is that ruling elites are insane.


Exactly right 👍


"The Management of Savagery, you can see that they have weaponized Tainter's ideas against the United States for decades."

Examples?

The Management of Savagery = 2004. Never heard of it before just now. Is that what they used to make the Americans fail in Iraq, Afghanistan & Africa over the last 20 years? Is that their excuse for their monumental failure against the sandal wearers?


I read The Management of Savagery shortly after Tainter, and I could see his influence on several parts, especially in the discussion of the collapse of the Ottoman Empire (or rather, its failure to collapse). Essentially, the book lays out a plan to create the conditions for a true social collapse in the Middle East so that people would be willing to be ruled by a Caliphate. The plan was to conduct a series of guerrilla attacks to draw US military strength into the region and exhaust it. Given that the surge and drawdown of US forces in Iraq eventually led to ISIS, I would say that the initial phases of their plan worked beautifully.

The most interesting part of the book, though, is its description of how Resistance groups are fighting a long-term economic war against the West. They describe, for instance, how when they bomb one Western hotel, every hotel in the world is forced to beef up security, increasing costs. Indeed, they seem very aware that terrorism forces unwanted social complexity on their foes (such as the USA creating the useless Department of Homeland Security, an example Tainter himself cited of ineffective social complexity), and that this is an essential part of their grand strategy.


Jevons Paradox! Here's a long video explaining it, but the prof is great:

https://www.youtube.com/watch?v=SM8pQmA7wos


I skipped to the section where they explain the Jevons Paradox. It seems to me that it is primarily a function of capitalism; I don't know if it would necessarily hold in a socialist society. Capitalism requires continuous growth, which requires the continuous creation of new artificial "needs." Eliminate that factor and you might eliminate the phenomenon where efficiency leads to more consumption.
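The rebound mechanism can be sketched numerically with a toy constant-elasticity demand model. All numbers are invented for illustration; the point is only that when demand is elastic enough (elasticity above 1), an efficiency gain increases total fuel use, while with elasticity below 1 it would decrease it:

```python
# Jevons paradox sketch; all numbers invented for illustration.
baseline_service = 100.0  # units of energy service demanded initially
fuel_per_service = 1.0    # fuel needed per unit of service initially
elasticity = 1.5          # price elasticity of demand; > 1 means rebound beats savings

# Efficiency doubles: each unit of service now needs half the fuel,
# so the effective price of the service halves.
new_fuel_per_service = fuel_per_service / 2
price_ratio = new_fuel_per_service / fuel_per_service  # 0.5

# Constant-elasticity demand: service demanded scales as price^(-elasticity).
new_service = baseline_service * price_ratio ** (-elasticity)

old_fuel = baseline_service * fuel_per_service
new_fuel = new_service * new_fuel_per_service
print(f"fuel before: {old_fuel:.0f}, fuel after efficiency gain: {new_fuel:.1f}")
# Fuel use rises despite (indeed, because of) the efficiency gain.
```

Whether real-world demand for energy services is that elastic is exactly the empirical question the paradox turns on.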


Maybe, but I would suggest that every society has an incentive to grow (if it can) for the sake of geopolitical influence: conquer or be conquered. That's why almost every "primitive" society has been wiped out (displaced, absorbed, etc.) and "civilization" (including both developed and developing countries) has spread worldwide.


Good point.


"And no, artificial intelligence will not eat us alive for our atoms either, as it cannot hope to build itself up from carbon, oxygen, nitrogen, hydrogen and calcium — materials our bodies consist of."

You're no fun. :(


Douglas Rushkoff has talked pretty regularly on his Team Human podcast about how AI CEOs, founders, and funders pushed this myth of a globally threatening AI to hype the perceived danger, and thus the potential, of AI and spur investment.

I’m not particularly an optimist here, but a recent New York Times article showed state-by-state total energy generation and how quickly it has changed since 2001. Idaho and Kansas, for instance, now both produce roughly 50% of their total energy from wind, which was next to nothing in 2001. As low-population states this may not mean much, but there is clearly lots of room for growth here.

Personally, I’d like to see mass swaths of land/prairie in the central US returned to wild bison habitat/migration, with optimal wind/currents used to select the land so that this contiguous mass of land could also be used for wind energy generation. Allow First Nations stewardship of the land and herd, to maintain and cull the herd for food. I’m sure George Monbiot would scream at this, but oh well.

I’m aware the concentrated energy generated from wind is not comparable to oil, and I am concerned about a massive energy crunch on the horizon; however, fusion is potentially 20-50 years off. This is the race, and not one I’ve heard many discuss in realistic detail. Even without "generative AI," will AI be able to speed up fusion development? Again, I’ve not heard an answer here. Even with an energy crunch, there’s likely not a scenario where governments don’t use available energy to continue developing fusion, so whether, and how long, there’s an energy crunch is unknown. But back to Idaho and Kansas: all that change happened without the real effects of climate change pushing development to move more rapidly.

In this scenario I can’t help but think the real concern comes from Jaron Lanier’s take on AI: that generative AI taking over the world is not the issue, but AI being used as a commercial tool, for fraud, hacking, and for chaos against governments might very likely drive us all crazy.


Gail Tverberg (OurFiniteWorld.com) is fond of saying that windmills are powered by a continuous stream of diesel trucks, bringing spare parts.

I used to imagine this meant an occasional UPS truck bringing a circuit board.

Then, I drove across North Dakota on US 10.

I was impressed with the massive number of wind turbines along that highway. "Boy, they're really solving the energy problem!" I thought.

Then, I noticed something I had not anticipated.

Every thirty seconds or so (I actually timed it!) I passed a huge, thirty-wheel (I actually counted!) semi-truck/trailer going the other direction. They were each carrying a small portion of a wind turbine. One truck per blade. Three trucks per disassembled column. Six trucks per replaced turbine.

I can only imagine that an equal number of trucks extended ahead and behind me, carrying new replacement turbines.

This means that each wind turbine arrives (from where?) at the cost of two gallons of diesel fuel per mile travelled to get there! (Twelve trucks, each getting six miles per gallon, works out to two gallons per mile.)
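For what it's worth, the back-of-the-envelope math here checks out, taking the comment's own assumed figures of six miles per gallon and twelve trucks per turbine swap:

```python
# Back-of-the-envelope check of the comment's numbers (both assumed):
mpg_per_truck = 6        # assumed heavy-truck fuel economy
trucks_per_turbine = 12  # six hauling away the old turbine, six bringing the new one

# Each truck burns (1 / mpg) gallons per mile; total is that times the fleet.
total_gallons_per_mile = trucks_per_turbine / mpg_per_truck
print(total_gallons_per_mile)  # 2.0 gallons of diesel per mile, per turbine swap
```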

Art Berman (et al.) think that, as fossil sunlight goes into decline, countries that currently have an excess will need more and more of that excess for their own use. They estimate that "the end of diesel exports" will happen in 2027 or 2028.

At that point, the US is going to be in a world of hurt.

Yeah, the US is currently the biggest petroleum producer, thanks to fracking, which appears to be pretty unique to North America. (Saudi Arabia won't be able to frack Ghawar when they can no longer pump it. The geology is all wrong.)

But what you don't often hear, is that fracked oil is overwhelmingly *light* oil, containing perhaps 50% less diesel than previous heavy crude contained. Thus, while the US produces a higher *volume* of oil than anyone else, that oil contains much less energy than it did just a decade or so ago.

The world runs on diesel. For mining, long-haul trucking, trans-oceanic shipping, and most importantly, agriculture. *None* of these are easily electrified to run on so-called "renewable" energy.

In a few years, the US may have to choose between feeding people, or mining, or shipping products. That's going to be a tough choice.

My bet is they will cowardly "let the market decide," which means poor people will starve while the cost of manufactured goods goes through the roof.

Interesting times ahead! Hope you're growing food! (If not, start TODAY!)


It won't be a tough choice for the US ruling class


Thanks for the article. I think in a few places you have used generative AI when you mean Artificial General Intelligence (AGI).

We have lots of generative AI now (Claude, ChatGPT, etc.). We don’t have AGI yet, and I’m skeptical that we will. We seem to hit these last-mile problems, like self-driving cars defeated by the last 5%, which is itself a complexity problem.


Glad to see you've discovered Andrew Nikiforuk! He alone is a good reason to keep an RSS feed on TheTyee.ca.


Thanks for restating the obvious-to-some-of-us reality again, B.

I'll excerpt this post on my blog again, because folks still need reminding.


B, reaching out again to see if you are interested in participating in a writing project. Please let me know and we can find a means to communicate privately.

Steve


The humans do not need AI to destroy themselves. They have already triggered their own destruction.

Here's a link to an article featuring a string of short, embedded videos showing the consequences of one day of climate-change-jacked flooding all over the planet:

**24 Hours of Liquid Death**: floods from the past day alone

https://www.collapse2050.com/24-hours-of-liquid-death/

Warming of the globe and the moisture on it, causing increased evaporation and thus increased precipitation, was one of the easier, no-brainer global warming predictions, and it has come to pass by the bucketful.


Nice to see Tainter and Doctorow referenced in the same post.
