Baby Shoggoth is Listening

This subject is interesting to me. I don’t usually get caught up in questions of legacy, but the framing of impacting the AIs by showing them who you are and what you find important is interesting.

I don’t think I’d say I have to write so that the AIs know me. I’ll be focusing on my kids, Sadie, friends, and the Valley knowing me. But anyone who has ever written and shared a word has wanted to impact someone or something through it.

And this part of the post draws me in: the only way to influence the Shoggoth is to write to it in a way that steers it, telling it who you are and which ideas matter to you.

There’s a weird metaphysical angle the author gets at: if you write, and the AIs find you, could you be brought “back” into consciousness again? The author relates this to resurrection and to being a Christian who believes there will be a new heaven and a new earth when God redeems the world through Christ’s return. Framed that way, I guess it’s not all that weird.

I think I have a high discount rate on this though.

Right now, I’m writing primarily in order to 1) make sense of all that’s before me and around me in the present, and 2) leave my kids a central place to discover that thinking and writing, when and if they want to.

I guess along the way I can afford some attention to “writing for the AIs.” For what purpose? I’m not sure. But a small amount of effort doesn’t feel wasted to me.

This passage on the “sci-fi form of Pascal’s Wager” makes some sense to me, so perhaps that’s the understanding from which I write a couple of words for the AIs:

“Something similar can help for the question of resurrection. There is, shall we say, a distinct lack of certainty about that, too. If you believe that a human is basically a biological computer, a belief I resisted until recently, then it stands to reason that some supercomputer in the distant future will figure out how to emulate us like a PC now emulates a Super Nintendo. If you don’t believe that, well, laugh away, as I did until I changed my mind. And if you don’t know what you believe, here’s where familiar moral thinking might be further applied. It might even tip things in the “sure, why not” direction, because heard this way, the question sounds much like a sci-fi form of Pascal’s wager. Pascal argued that even if God’s existence is uncertain, belief is rational: The cost of being wrong—wasted piety—is finite, whereas the reward of being right—salvation—is infinite. Here, the calculation looks like this: If digital resurrection exists and you wrote yourself down, you or your near-analog get infinite or near-infinite life; if it doesn’t, you’re dead anyway, and have merely wasted some time. Worst-case scenario? You’ve written something for the here and now that humans can read and appreciate.”
