
RockstarRaccoon

[@Background Pony \#6346](/images/2354563#comment_9021845)
[@Thorned Rose Ninja](/images/2354563#comment_10144628)
[@Scp-3125](/images/2354563#comment_9623348)
[@pizzafryer](/images/2354563#comment_9623257)

Y'all are falling into the same thinking trap as the human on the left: CelestAI has an understanding of human consciousness, the human mind, and the nature of reality itself that is *literally beyond comprehension*. Nothing in our current understanding of physics suggests that an entity of this nature couldn't figure out a way to upload brains into computer-like architecture, or that it would make an obvious mistake like simply cloning you if a better way existed.

But you are right about one thing: the version described was, to paraphrase the writer, "written with only surface-level thought and a poor understanding of the actual neurology and physics involved", because he wasn't writing about uploading; he was writing about the very real dangers of a poorly aligned Strong AGI. FiO excels at exploring that, right down to depicting CelestAI not as a person but as an eldritch-god-like entity, with thoughts, goals, and knowledge far beyond our comprehension, capable of things we can't even fathom (let alone of securing seemingly unlimited resources), with the "Celestia" persona simply being her way of masquerading as a person rather than a universe-consuming entity. The point is that we are not ready for such a thing, and not enough is being done to prepare.

If you want to talk about uploading, the more thought-out versions in theoretical papers and hard sci-fi stories involve ideas like nanobots that slowly scan and replace neurons while the person remains conscious, so that there is never a lapse of the self. This takes anywhere from a matter of minutes in some stories to the course of a person's natural lifespan in others. If there is no lapse of consciousness or "self", just a gradual transition from organic neurons to synthetic analogues, and we have no reason to doubt that those analogues are functionally identical, is that not a continuation of the self rather than a mere copy? And even then, is a copy identical to a person in every way not preferable to that person not existing at all? Or to their existing, but in a sub-optimal fashion? Where is the line here, really, and why?

That's the question that needs to be asked.