WALRUS: Elon Musk says we're probably living in a simulation.
TURTLE: Yeah? Hm. One of those Zen types thought he was living in a butterfly's dream, didn't he? Anyway, isn't Mush the guy who wants to colonize Mars?
W: Yeah.
T: I think Stephen Hawking also believes we're in a simulation, doesn't he?
W: I'm not sure. Mawking does think AI is a threat though.
T: And ET too, right?
W: Right. But aren't all these the same thing? The same sort of thing, I mean?
T: What sort of thing?
W: The theological sort.
T: (Pause.) You've lost me.
W: Well, who's making the simulation?
T: Umm, yeah....
W: ET's, right?
T: Okay.
W: And the super-smart AI -- isn't it kind of god-like?
T: Oh, I don't --
W: I know that's not what the AI-fearers want to say. After all, it will have been we who created the God in that case, not the other way around. But isn't there something God-fearing about it all, just the same?
T: No, come on --
W: As with ET-fearing?
T: Certainly not. Especially that last. It's simply the problem of superior technology, and the encounter between mismatched cultures.
W: One of those SF types once said, apparently, that any sufficiently advanced technology would be indistinguishable from magic. Would you agree?
T: Mm.
W: But then how would magic be distinguishable from miracle?
T: Mm.
W: And who or what produces miracles other than divinity?
T: Yeah, I see where you're --
W: So that's what I mean. Super-smart AI, or scary-smart ET's, or whoever's behind the simulation -- it's not distinguishable from what people used to call god or gods. Right? C'mon. What's the difference?
T: (sighing) Lots. Lots of difference. All those things are at least part of the natural world. God or gods are supernatural. And God is --
W: Yes, but --
T: God is a benevolent agency. As the supposed creator of the natural world, God would pretty much define benevolence. Whereas the sort of agencies you're talking about are simply superior to humans. There's no guarantee they would share any notion of benevolence we would recognize -- that's the point of the worry.
W: Yes, that is the point. I grant that those sorts of agencies are not identical with certain traditional notions of God. But other and older notions, on the other hand, compare the gods to wanton boys, toying with us like flies. And that's the idea that has these new god-fearers worried, as you say. They present their fears as rational, of course -- that's the point of locating them in the "natural world" rather than the supernatural, since the latter comes across these days as merely primitive and old-fashioned, to the cognoscenti. But their imaginations have extended the supposedly natural world into regions indistinguishable from the old supernatural, and then populated those spaces with dragons. They, the enlightened, suppressed the God of their fathers, but now find themselves facing the return of the suppressed -- the mad, or bad, Gods.
T: (with a laugh) Yeah. Look, I don't want to venture into the murky waters of theology with you. But these aren't your average idiot pundits -- these are seriously smart people, Musk and Hawking, and there are others, also very smart. All saying that their worries are real, not mere superstitions or mad gods, etc. So --
W: True. And like most people you can just accept what they say, add these esoteric concerns to the stack of worries about global catastrophes that we have little influence over and that have little immediate effect on us -- like killer comets, supervolcanoes, thermonuclear war, climate change, etc. -- and get on with what we call life. But what I'm saying is just that we could take a minute to stand back from that stack and examine it. Some of the items in it are pretty obvious, some --
T: And what they're saying is that there are some things we could and should do about them.
W: Yeah? What do we do about living in a simulation? Where do you get those red pills?
T:
W: Yeah. And the point is, that's just a more obvious example of a trait all these outré fears share -- they're all at the far end of a logical chain that's ... fragile, open to a myriad unknowns, known and unknown unknowns. It's like the argument of an intelligent paranoid, each step or bit of evidence superficially plausible in itself, but the concatenation makes the whole construction absurd, like a Rube Goldberg device.
T: Why don't they see that themselves then? Too stupid?
W: No, they're smart enough. That may be part of the problem in fact -- they're not immune to the sort of hubris that can afflict intellectuals of all sorts, the attraction of the esoteric. The other part, maybe the larger part, is that intelligence doesn't protect against the sort of metaphysical or ontological fear that haunts all cultures, and that provides the real basis for the elaborations of the various religions. But this isn't really about personalities, it's --
T: So why don't they see that themselves, their own "ontological fear"?
W: Yeah. That's a good question. I think because it's an embarrassment. I think it embarrasses people generally in these secular days. Especially those who think of themselves as people who've moved beyond "primitive" fears, which they consider forms of superstition. But I think these sorts of worries are prima facie evidence that they haven't.
T: So Musk, Hawking, et al. are just like hunter-gatherers worshipping Thunder gods?
W: Yes. Well, two things: first, I think they've regressed past the point of worshipping. That takes some sort of theological structure, and they don't have that anymore. That explains why they resort to these -- let's call them hyper-rational but really, they're just pseudo-rational -- fears to fill that gap. And second, it's not really fair to single them out. They're quoted because they're celebrity wise men, and because they come up with ways to make abstract fear concrete. But all of us today -- or most of us at least -- have that fear, even if we don't have a form for it, and certainly not a coping structure.
T: Speak for yourself!
W: Oh, not you, Turt! You're an exception. You're not scared of no void! Why, all you have to do is slap a label like "singularity" on it and it's covered, in more ways than one.
T: Wait, what?
W: The void. You know ... the abyss?
T: Yeah, yeah, the abyss. I can outstare the abyss --
W: Right. You're a monster!
T: How'd we get off on this?
W: Sorry. Let me backtrack a little. Take the super-intelligent AI. One of the arguments used to scare people about it imagines something called a "paperclip maximizer". The idea, if you haven't heard of it, is that our AI has been programmed to maximize the production of paperclips. Since it not only learns, but learns to learn, super-fast as computers are, it becomes super-intelligent, and before we know it it's converting the entire earth, including human beings, into paperclips, and then goes on to convert the universe into paperclips. It's funny, right? But also serious in that it's meant to show that the goals or values of a super-intelligence may be not just alien to human values, but absurd or meaningless. So absurd, in fact, that it's scary. It's all over the Internet, look it up. It makes people shiver, and to some extent that's just the thought of a Mad God-Mind, wreaking cosmic havoc. But the very absurdity of the image -- the paperclips -- is the clue that it's touching on some deeper fear.
T: Namely?
W: That maybe we're just paperclip maximizers ourselves. Hm? Just as absurd, I mean. What if that's the fear?
T: Your point being what? We fear absurdity? Isn't that old hat?
W: Yeah, old wine, new bottles. The point, though, is that underneath a superficially rational, technical concern there lies an old anxiety about the loss of meaning in a universe lacking a divine purpose.