
The Doomsday / Apocalypse Thread

Billionaire Preppers.

Tech billionaires are buying up luxurious bunkers and hiring military security to survive a societal collapse they helped create, but like everything they do, it has unintended consequences
by Douglas Rushkoff

As a humanist who writes about the impact of digital technology on our lives, I am often mistaken for a futurist. The people most interested in hiring me for my opinions about technology are usually less concerned with building tools that help people live better lives in the present than they are in identifying the Next Big Thing through which to dominate them in the future. I don’t usually respond to their inquiries. Why help these guys ruin what’s left of the internet, much less civilisation?

Still, sometimes a combination of morbid curiosity and cold hard cash is enough to get me on a stage in front of the tech elite, where I try to talk some sense into them about how their businesses are affecting our lives out here in the real world. That’s how I found myself accepting an invitation to address a group mysteriously described as “ultra-wealthy stakeholders”, out in the middle of the desert.

They started out innocuously and predictably enough. Bitcoin or ethereum? Virtual reality or augmented reality? Who will get quantum computing first, China or Google? Eventually, they edged into their real topic of concern: New Zealand or Alaska? Which region would be less affected by the coming climate crisis? It only got worse from there. Which was the greater threat: global warming or biological warfare? How long should one plan to be able to survive with no outside help? Should a shelter have its own air supply? What was the likelihood of groundwater contamination? Finally, the CEO of a brokerage house explained that he had nearly completed building his own underground bunker system, and asked: “How do I maintain authority over my security force after the event?” The event. That was their euphemism for the environmental collapse, social unrest, nuclear explosion, solar storm, unstoppable virus, or malicious computer hack that takes everything down.

This single question occupied us for the rest of the hour. They knew armed guards would be required to protect their compounds from raiders as well as angry mobs. One had already secured a dozen Navy Seals to make their way to his compound if he gave them the right cue. But how would he pay the guards once even his crypto was worthless? What would stop the guards from eventually choosing their own leader?

The billionaires considered using special combination locks on the food supply that only they knew. Or making guards wear disciplinary collars of some kind in return for their survival. Or maybe building robots to serve as guards and workers – if that technology could be developed “in time”. ...

https://www.theguardian.com/news/20...-bunkers-apocalypse-survival-richest-rushkoff
 
Yeah, once 'the event' occurs we're back to good old 'last man standing'...
 
My worry is that when they're ready, they will engineer that collapse so they can then rebuild in their own image.
How? The standing of those purported to be engineering such a collapse is directly linked to the populace at large providing them with income and status. Once the population is reduced, by whatever means, the source of their power has also been reduced, as has the buying power of 'billions'. There's a limit to what can actually be achieved here - for every 'tech billionaire' there's a billion people who would hang them from a lamp-post.
 
It's been my thought for years now that we regular working class people are nothing more than expendable pawns, being moved around by those in charge.
It's not paranoia or pessimism, it's from watching events as they unfold.
And I find it particularly unnerving that my Grandfather's warning from many years ago is exactly what we have been living through in the last few years. He was living near the Russian / Chinese border, not by his choice, and said that trouble would originate there.
 
Yep. If an elite class kills off the general population, most of what they own would become worthless.
Defending the stuff that does have a value would be really difficult. There'd be complete anarchy and no law enforcement.
An 'elite' class that did this deliberately would be self-selecting themselves for a nasty death.
 
I think this fits in here.

The Art of the Apocalypse: Dr Scotty McQueen on A.I., Unfiction, and Ethical Interventions

There is an undeniable appeal to apocalyptic narratives: the proliferation in popular culture in the age of digital technology suggests that they retain the hold over audiences that they always have. Yet, the messages of these narratives are often far from constructive. How can we approach these narratives ethically? Can we draw useful lessons from them?

These are some of the questions Dr Scotty McQueen explores in his 21-part creative practice YouTube series, The Art of the Apocalypse: From A.I. to Zombie, which encourages viewers to redirect their relationships with narratives of the apocalypse and, indeed, digital media, in more productive and healthy directions. A behavioural insights and personal development specialist, McQueen currently holds an Early Career Research Residency with the Trinity Long Room Hub Arts and Humanities Research Institute, working on digital media in the School of Creative Arts. In an interview with Trinity’s HUMAN+ programme and the Office of the Dean of Research, Dr McQueen discusses the creative process and ethical decision-making around A.I. that is explored in The Art of the Apocalypse.

McQueen quasi-roleplays as an apocalypse survivalist: the self-proclaimed “Crazy Conspiracy Dude,” who guides viewers through apocalypse-themed workshops for survival-based activities, including disaster preparedness, self-defence, wilderness survival, and sustainable digital practices. The YouTube series includes a section on the A.I. apocalypse, which Crazy Conspiracy Dude describes as “a three-part series, designed around the trinity of human badassness: the mind, the body and the spirit.” He goes on to explain that “for the moment, it is still possible to take advantage of digital technologies without letting digital technologies take advantage of you.”

These videos explore concepts from the fields of philosophy, psychology and physics among others, incorporating thought experiments, magic tricks, and reflections from writers and thinkers throughout history. This enables McQueen’s avatar to demonstrate strategies of awareness regarding the influence digital technology has on our lives. After all, he points out, if tasked with creating an A.I. that could manipulate humans into destroying their own civilization, “you might have a hard time coming up with a more efficient solution than the engagement-maximizing algorithms already controlling your own social media feeds.” ...

https://www.tcd.ie/research/researchmatters/mcqueen.php
 
How doomed are we? Is the end nigh or do we have the horse sense to say neigh?

In 2020, Oxford-based philosopher Toby Ord published a book called The Precipice about the risk of human extinction. He put the chances of "existential catastrophe" for our species during the next century at one in six.

It's quite a specific number, and an alarming one. The claim drew headlines at the time, and has been influential since – most recently brought up by Australian politician Andrew Leigh in a speech in Melbourne.

It's hard to disagree with the idea we face troubling prospects over the coming decades, from climate change, nuclear weapons and bio-engineered pathogens (all big issues in my view), to rogue AI and large asteroids (which I would see as less concerning).

But what about that number? Where does it come from? And what does it really mean?

To answer those questions, we have to answer another first: what is probability?


The most traditional view of probability is called frequentism, and derives its name from its heritage in games of dice and cards. On this view, we know there is a one in six chance a fair die will come up with a three (for example) by observing the frequency of threes in a large number of rolls.
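The frequentist idea in the paragraph above can be sketched in a few lines of Python: simulate a large number of rolls of a fair die and observe how often a three turns up. (This is an illustrative sketch, not part of the quoted article; the sample size and seed are arbitrary choices.)

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

# Simulate many rolls of a fair six-sided die.
rolls = [random.randint(1, 6) for _ in range(600_000)]

# The frequentist probability estimate is just the observed frequency.
freq_of_three = rolls.count(3) / len(rolls)

print(f"Observed frequency of a three: {freq_of_three:.4f}")
```

With enough rolls, the observed frequency settles near 1/6 ≈ 0.1667, which is what the frequentist view means by "a one in six chance".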

Or consider the more complicated case of weather forecasts. What does it mean when a weatherperson tells us there is a one in six (or 17%) chance of rain tomorrow? It's hard to believe the weatherperson means us to imagine a large collection of "tomorrows", of which some proportion will experience precipitation. Instead, we need to look at a large number of such predictions and see what happened after them.

If the forecaster is good at their job, we should see that when they said "one in six chance of rain tomorrow", it did in fact rain on the following day one time in every six.
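The calibration check described above can also be sketched: group a forecaster's past predictions by the probability they stated, then compare each group's stated probability with the fraction of days it actually rained. The records below are hypothetical data invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical forecast records: (stated probability of rain, whether it rained).
records = [
    (1/6, False), (1/6, False), (1/6, True),
    (1/6, False), (1/6, False), (1/6, False),
    (0.5, True), (0.5, False), (0.5, True), (0.5, False),
]

# Group outcomes by the stated probability.
outcomes = defaultdict(list)
for stated, rained in records:
    outcomes[stated].append(rained)

# A well-calibrated forecaster's observed frequency matches the stated probability.
for stated, results in sorted(outcomes.items()):
    observed = sum(results) / len(results)
    print(f"stated {stated:.2f} -> observed {observed:.2f} over {len(results)} forecasts")
```

In this toy data the forecaster is perfectly calibrated: it rained on one of the six "one in six" days and on half of the "50%" days.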


So, traditional probability depends on observations and procedure. To calculate it, we need to have a collection of repeated events on which to base our estimate. So what does this mean for the probability of human extinction? Well, such an event would be a one-off: after it happened, there would be no room for repeats.


Instead, we might find some parallel events to learn from. Indeed, in Ord's book, he discusses a number of potential extinction events, some of which can be examined in the light of history. ...

https://www.sciencealert.com/does-humanity-really-face-a-1-in-6-chance-of-dying-this-century
 
Personally, if I could, I would go back in time and tell J. Robert Oppenheimer that he is responsible for destroying the planet.

I am pessimistic about the future of our planet.
It's OK, the planet is going to be just fine.
It's the humans who will die eventually.
 
I wonder how many people the earth can accommodate in the future - maybe 10 billion?
 
We'll soon find out I guess.

The 20th century was, by some distance, the bloodiest in humankind's history.
The 21st doesn't seem to be faring much better.
Could the common denominator be the vast and unsustainable increase in human population over the last 100 or so years?
It was in terms of scale, but not in terms of motivation. Greed is and always has been at the heart of most conflict. Greed for power. Greed for wealth. Greed for land. Greed for love. Greed for attention. Greed for vengeance. Greed for security. A content heart / mind seeks not war.

 
Thou shalt not immanentize the eschaton :)
https://en.wikipedia.org/wiki/Immanentize_the_eschaton
 
Too late! Eschaton already immanentized!
 