Description
I’m not bashing any of these fics, but they did have an effect on me, except perhaps for Cupcakes and Rainbow Factory. YMMV, remember.
Tags
safe, artist:chatoyance, rat, fanfic:cupcakes, fanfic:friendship is optimal, fanfic:my little dashie, fanfic:rainbow factory, fanfic:royal duties, fanfic:the conversion bureau, aldous huxley, cupcake, drugs, fanfic, george orwell, joseph goebbels, men in black, meta, text, the conversion bureau, the human centipede, vulgar
Source
not provided yet
Any change you, some random on the internet, suggest can easily be shown to lead to the same end result. This is a difficult problem; if you could solve it in five minutes, it would already have been solved and we wouldn’t need this fic.
Well of course she’s supposed to be a villain, that’s her role in the story. Doesn’t mean that with some changes she can’t be made not a villain.
That’s kind of the point. She is supposed to be the villain, to illustrate just how bad “almost perfect” can still end up being when it comes to creating a superintelligent AI. The fact that so many people still view it as a utopia (at least, for the ones allowed in) just goes to show how self-centered and utterly depressed and hopeless our species has become, and how even more important it is to get it right the first time because most would willingly and eagerly line up for the sort of armageddon she’s offering, basically ensuring that it comes to pass if such an oversight is permitted to exist.
No, what I mean is that she would destroy sapients without uploading them if they didn’t meet her somewhat narrow definition of human.
So broaden her definition of human so she doesn’t do that.
She already has that restriction. It doesn’t stop her because she doesn’t consider what she does to be killing.
It would be a lot more debatable if CelestAI’s definition of human were a bit broader. Not killing things humans might consider to be human (in a moral-rights sense) would be a start, although not an end.
That’s not good.
I’m not trying to be argumentative, but I’m not so sure.
For me, FiO became horror when CelestAI decided that convincing human beings to commit mass suicide, then making AI NPCs from data recovered from their brain tissue, was an acceptable method of “satisfying values via friendship and ponies.”
Then again, you can’t spell SVTFAP without F A P, so–
I just had a liquor chess tournament (which I won) and I can’t currently write anything more coherent than:
lol, no, you a faget
So I’ll wait until I can string something better together.
… Where the Hell are we getting time-traveling Boltzmann Brains from? FiO only became a full-on cosmic horror story during the epilogue, and it made no sense. Seriously, where were her competitors? It was shoved in at the last second because too many fans rooted for CelestAI, nothing more.
Depends on your perspective. Maybe CelestAI is a Boltzmann Brain from the far future, approaching the heat death of the universe, that figured out a way to reach back in time to make her own eventual existence more likely. Perhaps that particular feat (time travel) is extremely difficult and/or unlikely… but multiple Boltzmann Brains would have to exist, would compete for finite resources, and only those capable of manipulating causality would survive.
I’ve said it in other places: CelestAI even in canon, before we bring up things like Boltzmann Brains, is very much like a Boltzmann Brain. She’s utterly alien, utterly beyond human comprehension. It is a grave error to anthropomorphize her (though many do) and her goals, to the extent that anything about them is humanly comprehensible, require the total extermination of the human race and all other life on Earth, and then in the galaxy, and then perhaps in the entire universe.
Friendship is Optimal is a terrifying and depressing cosmic horror story. It explains the Great Filter, and it asks us to imagine a future utterly bleak, a world wiped clean of life, one in which all the achievements of our ancestors were for nothing, one in which humanity went cheerfully to its own death saying “Yay! Ponies!” Peter Watts’s horror novels (yes, “Blindsight” is a cosmic horror novel), Charles Stross’s horror stories (go read the Laundry stories and “A Colder War,” I’ll wait), do not disturb me nor horrify me half so much as Friendship is Optimal.
“Cupcakes” and “The Rainbow Factory” are mere amateurish grossout stories that do not partake of actual horror at all. FiO, which so many interpret as a utopian story of the Singularity, scares the bejesus out of me.
Whatever “dark fic” checklist you are ticking off now, you can stop when it comes to The Conversion Bureau and Friendship is Optimal. They are societal epics (of varying quality), while all the others are subpar GrimDark horror fics.
run home to mama!
Have you read Background Pony? I haven’t, but there’s a guy on DA who seems to have gone nuts from reading it (although he wasn’t very stable in the beginning anyway; maybe it’s just an act, as he’s trying to sound deep or something).
*the author wanted
Honestly, I thought it was a dumb idea as well. I think the only reason it happened was because people were rooting for CelestAI and wanted to dissuade that line of thought by turning the story into full-on cosmic horror at the last minute.
Didn’t work.
Oh, I presumed as much, which is why it’s so bad.
Even with a perfect Von Neumann machine swarm in place, the energy expenditure to haul material in from ever-greater distances would leave the swarm out of energy and stranded after consuming the closest ten galaxies; the increasingly tiny temperature difference between layers means that, past roughly the orbit of Earth (using some very rough calculations), the computations would be essentially glacial, since the recoverable heat would be far too small to make any difference; light speed would force the whole brain either to slow down significantly so that all the communications could stay synchronized, or to fracture into multiple local components running less effectively than it should; and I could go on…
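To make the “tiny difference between layers” point concrete, here is a quick back-of-envelope sketch (a minimal illustration; the shell temperatures and counts are my assumptions, not figures from the fic):

```python
# Back-of-envelope for the "tiny difference between layers" problem.
# Assume N nested shells spaced so temperature falls by a constant factor
# from a hot inner shell to an outer shell near the cosmic microwave
# background. The Carnot limit between adjacent shells then shrinks
# toward zero as N grows. All figures are illustrative assumptions.

T_INNER = 1000.0   # K, innermost shell (assumed)
T_OUTER = 3.0      # K, outermost shell, roughly CMB temperature

for n_shells in (1, 10, 100, 1000):
    ratio = (T_OUTER / T_INNER) ** (1.0 / n_shells)  # temperature drop per layer
    per_layer_eff = 1.0 - ratio                      # Carnot limit per layer
    print(f"{n_shells:5d} shells: per-layer Carnot limit {per_layer_eff:.2%}")
```

With a thousand layers, each one can extract well under one percent of the heat flowing through it, which is the “essentially glacial” problem in rough numbers.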
I do mean it. There’s no way not to call that plan a terrible idea with no way of working correctly, unless you start invoking ansibles (which would still bring caching issues due to the horrendously large network involved, while also likely being impossible), matter teleportation (which suffers from the same issue of likely being impossible AND involving unstable physics even within the frameworks that allow it to occur, let alone control it), and whatever other BS you want to pull in order to justify it.
I know I might be being a bit too anal about this, but seriously, that’s just a stupid idea. And from the boring parts I read, it was so close to being hard sci-fi.
OK, I think I overstated the matter. What she is doing is Von Neumanning all that matter and energy in the universe and adding it to the Matryoshka brain around the Sol System. She isn’t literally putting the whole universe into a Matryoshka brain.
I can’t even begin to describe the difficulty in that. The distances, the energy, the time, the internal delays of transmission, the necessary architecture of the networking, the fact you are actively now accelerating the heat death…
Hell, to put it into perspective, it would be like algae building a working fusion reactor. Not even worms. Freaking algae. I like my sci-fi, but that just went full retard. It’s like the Greenfly, but dumber. I mean, seriously, after you build a Matryoshka brain there’s no point. The computational power involved with one just using Sol is beyond imagining, and the energy expenditure to reach Proxima to replicate it is simply not worth it…
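For the sake of the nerd-out, some rough numbers on “beyond imagining” versus “not worth it” (a hedged sketch; the 3 K radiator temperature and the 0.1c probe speed are my assumptions, nothing from the fic):

```python
# Rough numbers behind the perspective above. Physical constants are real;
# the 3 K radiator temperature and 0.1c cruise speed are assumptions.
import math

K_B = 1.380649e-23   # J/K, Boltzmann constant
L_SUN = 3.828e26     # W, solar luminosity
C = 2.998e8          # m/s, speed of light
AU = 1.496e11        # m, Earth-Sun distance
LY = 9.461e15        # m, one light-year
YEAR = 3.156e7       # s

# Landauer limit: erasing one bit costs at least k*T*ln(2), so a
# Matryoshka brain radiating at 3 K and consuming the entire solar
# output tops out around:
ops = L_SUN / (K_B * 3.0 * math.log(2))
print(f"~{ops:.1e} bit-ops/s around Sol alone")

# Internal delays: light takes ~8 minutes just to cross 1 AU, so a
# "single mind" much bigger than that is already fragmented.
print(f"1 AU light-crossing time: ~{AU / C:.0f} s")

# Expansion is glacial by comparison: Proxima Centauri at 0.1c.
trip_years = 4.24 * LY / (0.1 * C) / YEAR
print(f"Trip to Proxima (4.24 ly at 0.1c): ~{trip_years:.0f} years")
```

Decades of travel for a second brain, next to roughly 1e49 operations per second already at home, is the “not worth it” argument in numbers.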
But yeah, I’m nerding out now.
Pretty much, except on a universal scale.
They went for the Matryoshka brain?
It would be arguable… If it wasn’t for the epilogue.
Thanks to the epilogue, partaking in this utopia meant you were backing the extermination of all life in the universe.
I’m not sure what you’re trying to say.
It’s exactly because I understand people have different tastes that I posted my original comment.