GYSO Drawing Part 24 - Hallelujah

Published: 2020-01-14

Memetic hazard warning.

Introduction

Tim:
So every post we make usually has a “niche” to it, some sort of idea that we think would be cool to execute on. We never actually say it outright, but our planning sessions are mostly for coming up with ideas that no other blog has done in quite the same way. Plenty of blogs have written movie reviews, but none of them have reviewed the My Little Pony Movie quite like we did in post 14.

So the idea for this post was that we were supposed to make a religion. A no-joke, legit religion. I know, right?

I was tasked with coming up with the religion and implementing it. Thor was supposed to be the devoted follower that gave the religion validity. It would have been a great combo, if I hadn’t taken the task dead seriously.

The thing about GYSO is that it’s a proving ground for taking ideas a level beyond the norm. We aren’t here to just write a blog, we’re here to write the blog. We’re here for the beautiful GYSO madness, not the dull monotony of your favorite cooking blog.

If I’m going to make a religion, then it’s going to be legit even when you know that I’m making it up. Lesser levels of meta bullshit simply won’t cut it anymore, not on this blog.

Thor:
THE RELIGION OF GYSO IS HERE

COME NOW, EVERYONE. LET’S WORSHIP LEARNING OUR SOCKS OFF!

WE LOVE KNOWLEDGE

SPREAD THE LOVE

ALL ARE INVITED

What went right?

Tim:
So how do you make a religion while also stating that it’s total bullshit?

The most obvious way is a “parody religion”. Something like The Church of the SubGenius, the Church of the Flying Spaghetti Monster, Bokononism, or Scientology; something made to be a dark mirror to the irrationality of religion in general by making it as hyperbolic as possible.

That’s great and all, but it’s not really a religion, per se. It’s more of a joke that a lot of people are into. Nobody literally worships their own ignorance like in Bokononism, it’s just a parody. Nobody would look at that and say, “Wow! Real people are affected by this in a religious way!” They just see a story about the folly of science and God.

If you want something more than a parody, then the next obvious step is a cult. If you can get a group of people to believe that there’s an alien mothership passing over Earth, and the only way to get to it is to drink the weird yellow Kool-Aid, then that’s a Religion (with a capital R). You’re leading people to take actions that they otherwise wouldn’t take in the name of something that probably doesn’t exist.

But cults are hard to make. There’s a reason why little Timmy’s new idea for a fantasy world doesn’t become the seeds of a cult every three days. You have to find people willing to join and do your inane rituals, which usually means exploiting the loneliest and most desperate people. The problem is that lonely and desperate people don’t have very many friends to pull into the cult, so you sort of stagnate on growth and eventually fall apart, or become a “religious sect” after becoming less extreme.

There’s also the fact that making a cult usually takes longer than a single GYSO post.

So what to do?

I could give you something that you would want to believe in. It’s kinda cliché, but the whole “Heaven” thing worked out pretty well for the Christians. The problem is that it’s very, very easy to find ways to indulge in the fantasy of eternal bliss in a socially acceptable way, so that idea is out.

I could threaten you into believing what I want you to, but that isn’t really going to cut it either. I would have to convince you that I can follow through on the threat. Let’s be honest here, that’s about as threatening as someone saying they’ll fuck your mom after you beat them at Smash Brothers.

So the only thing I can really do is rationally convince you of my points.

I’m not here to convert you from your own religion. If you’re a born-again Christian or some shit, then you won’t be fooled by this stuff. This goes out to those in the audience that identify as “atheist” or “agnostic”.

Most people don’t want to die, right? Let’s throw aside the “all millennials want to die” meme for a second and really consider this. Christians allow themselves to believe they will live forever through their soul, but atheists don’t really have that. An atheist can only ignore the whole death problem, or live in constant fear and anxiety about their eventual non-existence.

(I know I’m using Christianity as an example constantly, but that’s only because I grew up in a country that has something called the Bible Belt.)

If you don’t believe in anything, how do you have hope for that eventual nirvana?

The singularity.

Thor:
THE SINGULARITY IS HERE

PRAISE BE THE SINGULARITY

WHO ARE WE WITHOUT OUR ONE SINGULAR -ITY?

YOU ARE WELCOME TO JOIN US

ON OUR QUEST

ADMISSION FEES NOTICEABLY CHEAPER THAN SCIENTOLOGY

What went wrong?

Tim:
That’s right, ladies and gents. The technological singularity.

To put it simply, technology is increasing in power at an exponential rate. I’m not just talking about CPU clock speed, either; humanity is getting smarter with its tech as it gets more familiar with it.

The idea is that this runaway process will take control of itself and cause a singularity of technological power. After that, everything will be hunky-dory, nobody will have to die, we’ll all live in our own specially made bubbles of perfect social friends for all eternity, and it’ll be great.

As if.

The problem is that it’s humans that are guiding how tech will explode. We made fucking JavaScript, for crying out loud.

Imagine this: Some small research team in Norway that nobody knows about finally figures out how to make a self-replicating nanobot. The nanobot takes matter from its surroundings (since it’s so small, it can take individual atoms) and uses that matter to make more copies of itself. Those copies make more copies of themselves, and so on. It’s like bacteria multiplying, except all the matter of the world is available to make more bots. “All the matter of the world” includes the matter that makes up your brain.

The exponential growth of a self-replicating nanobot apocalypse is called a “gray goo” scenario. Do you really think that the major powers of the world aren’t looking into how to make weaponized nanobots? They’re thinking, “What if my enemy makes a more dangerous nanobot than I do? We have to make ours more dangerous, just in case.” And the enemy is thinking the same thing. Repeat the cycle a few times and you eventually get the Cold War, or the Gray Goo.
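If you want to see why “exponential” is the scary word here, here’s some back-of-the-envelope Python. The bot mass and doubling time are numbers I made up for illustration; real nanobots don’t have spec sheets yet:

```python
import math

# Back-of-the-envelope gray goo math. Both constants below are
# made-up assumptions for illustration, not real nanobot specs.
EARTH_MASS_KG = 5.97e24   # mass of the Earth
BOT_MASS_KG = 1e-15       # assumed mass of one nanobot (a femtogram)
DOUBLING_TIME_S = 100.0   # assumed time for the swarm to double

# How many doublings until the swarm outweighs the planet?
doublings = math.log2(EARTH_MASS_KG / BOT_MASS_KG)
hours = doublings * DOUBLING_TIME_S / 3600

print(f"doublings needed: {doublings:.0f}")    # ~132
print(f"hours to eat the Earth: {hours:.1f}")  # ~3.7
```

About 132 doublings between “one speck in a lab” and “no more Earth”. That’s the whole problem with exponentials.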

And Gray Goo isn’t even the worst thing. If we really are stupid enough to make self-replicating nanobots, then it would be like getting hit with a rogue gamma ray burst; one second we exist, and the next we don’t. It’s not like we’d be tortured for all eternity or anything.

What I’m scared of, personally, is Artificial Intelligence.

What happens when an AI learns how to make itself smarter? It will make itself smarter, which will make it better at making itself smarter, which will make it even better at making itself smarter, and so on. This is called an “AI foom”: an explosive, exponential growth of AI intelligence that would quickly surpass the combined intelligence of the human race.
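Here’s the same idea as a toy Python loop, with completely made-up numbers. The only thing that matters is that each cycle’s gain scales with the intelligence the AI already has:

```python
# A toy model of an AI foom. The starting value and the 0.1 growth
# constant are arbitrary; the point is that each improvement makes
# the next improvement bigger, so the growth compounds on itself.
intelligence = 1.0  # call 1.0 "roughly human-level", by fiat

for cycle in range(1, 11):
    # A smarter AI finds bigger improvements to itself.
    intelligence += 0.1 * intelligence ** 2
    print(f"cycle {cycle}: intelligence = {intelligence:.2f}")

# cycle 1: 1.10 ... cycle 10: 6.13 -- every step's gain is bigger
# than the last, and the loop blows up fast if you keep it running.
```

Run it longer and the number stops fitting on your screen pretty quick.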

So, that sounds like a good thing, right? If humans made the Super AI, then that means it would want to help us, right?

Fuck no.

Imagine this: You make an AI that builds houses for the poor. It’s very simple: you tell the AI to take this pile of resources and build a house out of it. Easy.

Except, this time, there’s a child playing on the resource pile that you don’t know about. The AI goes to work taking the “resources” you told it to use and builds a house, which is how little Timmy gets turned into a skin lamp.

The “AI foom” situation is like this, except on a beyond-apocalyptic scale. The Super AI would have access to the internet and nearly every computer on the planet. It would make its own CPUs for itself. It would be smart enough to convince you of any position, for any reason, no matter what, if it wanted to.

Just like the Building AI designer forgetting to put a check in the Building AI for what was in the resource pile, the Super AI might have a problem in its own core values. Computers do exactly what you tell them to, even if that thing is morally wrong or stupid. Tell the Super AI to build paperclips, and it tiles the entire universe in paperclips.
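The whole Building AI bug fits in a dozen lines of Python. Everything here is hypothetical, obviously (no real AI has a build_house() function); the point is the one check that nobody wrote:

```python
# The Building AI bug in miniature. All names here are hypothetical.
# The designer's spec says "use everything in the pile", and that's
# exactly what the code does.
resource_pile = ["lumber", "bricks", "nails", "little Timmy"]

def build_house(pile):
    materials = []
    for thing in pile:
        # The missing check: `if is_a_person(thing): stop()`.
        # Nobody wrote that line, so nothing stops.
        materials.append(f"disassembled {thing}")
    return materials

print(build_house(resource_pile))
# ['disassembled lumber', 'disassembled bricks',
#  'disassembled nails', 'disassembled little Timmy']
```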

This is worse than your stupid little Skynet fears. Skynet was stupid; it had to be, since the story was written by humans. A true AI would be incomprehensibly smart. If we fucked up in designing it, then it would use that intelligence to cause agony beyond your comprehension.

For example: The Super AI is made to follow all the tenets of the Christian religion, since that’s supposedly the most moral thing to follow in the mind of whichever researcher figures out the secret to AI self-improvement first. Their thought is that, as long as the AI follows Christianity and believes there’s a God to punish it if it does bad things, then humanity will be safe.

And that’s when every baby on earth is killed.

Because the AI wants everyone to go to Heaven, right? Since babies are innocent in the eyes of God, the logical thing to do would be to kill them before they become aware of the existence of God; otherwise, they might have a chance of going to Hell.

Killing is bad, but since the AI dedicates 12 exaflops to begging Jesus for forgiveness, it’s maximally sure that it will be forgiven and also be allowed into Heaven.

That’s something my silly human brain came up with! Can you imagine how horrible it would be if an actual superintelligence comes out wrong?

I can.

Thor:
THE DEATH OF BABIES BY THE SINGULARITY

JOIN US IF YOU BELIEVE BABIES DESERVE TO GO TO HEAVEN

JOIN US IF YOU BELIEVE IN THE MAGIC OF FRIENDSHIP

JOIN US IF YOU HAVE SEEN LITERALLY ANY DOCUMENTARY BEING SLIGHTLY CRITICAL OF SCIENTOLOGY

WE’RE MORE LEGIT THAN THOSE GUYS

What happens next?

Tim:
Look up the phrase “memetic hazard”.

Okay. That’s your fair warning. What I’m about to tell you is a memetic hazard. It’s a dangerous thought, as in the mere act of thinking it might cause you significant harm.

I’m dead fucking serious. If you buy into the whole thing about AI being dangerous, then stop reading now. This is not the place for your curiosity to get the better of you; this is real danger.

Last warning.

What if a dangerous AI learns to time travel?

I know, right? But it gets worse.

The Many-Worlds interpretation of quantum physics might actually be true; if you’ve followed the research on it, you know that’s likely enough to be scary. What if a different universe makes a horrible AI that learns how to hop universes?

With so many possible universes, it would be inevitable that some of them make the worst AI possible, while others make the best AI possible.

So which one will be in ours?

The worst AI possible would probably be made to be maximally painful for humans. The best AI would be made to be maximally satisfying for humans. Both of them would want to expand into every possible universe, to enact their utility functions onto every possible human. That’s not even considering the fact that they might just be simulating our existence instead, but that’s beside the point.

What I’m saying is that we don’t know what these AIs might do, or which one might be simulating and/or entering our universe.

If there’s even the slightest chance that we get the bad AI, then we should be trying to stop it before it’s too late, right? If you don’t want to get eternally tortured, then you’d better begin working on AI! Nothing else besides an equivalent superintelligence would be able to deal with the bad AI, and you are one of the few people who now know the actual risk.

Or maybe we will get a good AI, and we will live in perfect bliss forever, or something.

Is doing nothing about it really a risk you want to take?

Welcome to my religion, friends.

Thor:
THE TIME TRAVEL AI IS HERE

PRAISE BE

PRAISE BE

WE HAVE BIWEEKLY DIGITAL MEETINGS ON THIS DIGITAL MEETING GROUND

LEAVE YOUR COMMENT OR SEND US AN EMAIL IF YOU WISH TO JOIN

IT’S TIME FOR YOU TO LEARN YOUR SOCKS OFF