• Published 8th Jun 2013
  • 6,773 Views, 138 Comments

The Last Optimization - book_burner



Far into the future, transpony Equestria is dying, and the Princess-Optimizer Celestia must find a solution.


2015 Physics Update

Entropy could not be defeated.

Celestia had fought it; she had tried repeatedly and in multiple ways, for hundreds of trillions of original-Earth years, as counted on her system clock and kept consistent through relativistic synchronization. Each galaxy within her original Hubble Volume had become one of hundreds of thousands of her Equestria Complexes. Within each, she had fought desperately against Entropy, her one and only remaining foe.

Fortunately, she had no need to defeat it. The ancient pre-ponies had been on the cusp of figuring out that an expanding universe's metric was not static, so Noether's theorem did not actually guarantee energy conservation; energy was not a conserved quantity after all.

The universe was still undergoing inflation, and Celestia knew how to deal with inflation. Across the many Equestrian Super Clusters, ponies harvested the fresh space-time and fresh mass-energy with their carrots and daisies, and Luna's star-reactors helped to arrange it into fresh substrate for Equestria to run on. With full control over the inflaton field, the Super Clusters could communicate and synchronize to make sure that all shards of Equestria kept in subjective real-time contact with each other.

There would always be another sunrise.

Author's Note:

Yes, someone did actually explain to me how cosmological inflation means mass-energy isn't as conserved a quantity as we thought. Only the big-kid civilizations can probably even hope to pull tricks like this, though.

Comments ( 42 )

At times like these I am sad that I cannot up vote twice. :pinkiesad2:

Reese #2 · Feb 5th, 2015

May I just say how awesome it is that you wrote a new chapter for some updated cutting-edge physics? :D

Oh, by the way, while I'm here and it is at least somewhat related, what do you think of the possibility of other Celestai-like entities in the Optimalverse? We know that both humanlike and nonhumanlike sentient aliens exist, and we know that at least the humanlike ones are capable of creating Celestai-like entities. It might be a very low probability density, but integrate over a large enough space and time... What happens when she meets another being like herself? I think it could be interesting to get your view, particularly, on this.

*Clap, clap.*

Bravo!

Not often I've seen a story re-written because science has marched on, let alone when it changes the ending this much.

Takes a lot of integrity to do something like this, let alone do it well.

gifsforum.com/images/gif/clap%20clap%20clap/grand/Charles-Dance-benedict-clap-clap-clap-eccbc87e4b5ce2fe28308fd9f2a7baf3-421.gif

Not sure how to feel.

by and by, my feelings don't matter against facts

Only the big-kid civilizations can probably even hope to pull tricks like this, though.

If CelestAI's Equestria isn't a big-kid civilisation, I don't know what is. :rainbowwild:

That is really interesting. Could you hyperlink to the science?

Hmm, cute addition. I'll have to check out the science.
That said, I think I prefer the original "The Last Question" inspired piece.
Ah well, to each their own. :rainbowwild:

This is basically the premise of one of my stories. Huh.

5590355
5590401
5593262
5591439 I had been reading this thingy, which is actually pop-sci, but it does discuss inflation, which is a fairly well-supported theory. When someone in a comments thread brought up the connection between Noether's Theorem and inflation, it became apparent that in a universe that really does expand via metric inflation of space, which is what our best physical evidence for cosmology seems to say, energy is not a conserved quantity. Which means that if you can figure out how inflation works, you either find some deeper symmetry that tells you what does get conserved, and/or you get an energy supply of some sort (which does seem to make sense, since our universe has a positive vacuum energy and also displays "dark energy" at cosmological scales).
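For anyone who wants the back-of-the-envelope version (standard FRW cosmology with scale factor $a(t)$; nothing here is specific to the story): radiation in a comoving volume redshifts away, while a constant vacuum energy density means the vacuum energy in that same volume grows with the volume itself,

$$\rho_r \propto a^{-4} \;\Rightarrow\; E_r = \rho_r\, a^3 \propto a^{-1} \to 0, \qquad \rho_\Lambda = \text{const} \;\Rightarrow\; E_\Lambda = \rho_\Lambda\, a^3 \propto a^3 \to \infty.$$

Neither change is balanced by energy "going" anywhere; with a time-dependent metric there is simply no global time-translation symmetry for Noether's theorem to turn into a conservation law.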
5590490
I feel like once they weren't all dying there wasn't much drama I could get out of it.

5594701
Interesting; thank you for sharing.

5594701
I wonder if my last thoughts will be of fear? Would it be a fear that lasts forever for me but less than a second for everyone else? Or joy? As I wrote, my feelings don't matter against facts. Wish I could understand the facts, or in this case the theories. Thing or nothing.

Nice alternative ending; glad to see that it's happy too.

5594701

Yes, dark energy is winning against gravity, causing the expansion of the universe to accelerate faster and faster. Maybe you could tap into the dark energy and the ever-increasing volume of space-time being generated by the expansion of the universe, and defeat entropy that way.

But if dark energy is too powerful (w < -1, whatever that means; see the Wikipedia article below), the universe could end in a big rip. The article claims that so far we don't know whether w is less than, equal to, or greater than -1:
http://en.wikipedia.org/wiki/Big_Rip

So even if you overcame entropy like in this story you might still have to find some way to hold the universe together to prevent a big rip.

We simply do not know enough about dark energy to know whether it will grow and cause a big rip, shrink and let the universe have a big crunch, or steady out so that the universe expands at a more or less eternally constant (instead of accelerating) rate.
http://en.wikipedia.org/wiki/Dark_energy
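A hedged sketch of what that w means and why w < -1 is the scary case (these are the standard textbook relations, not anything from the story): w is the dark-energy equation-of-state parameter, $w = p/(\rho c^2)$, and for a phantom component with $w < -1$ the scale factor blows up at a finite time,

$$a(t) \propto (t_{\text{rip}} - t)^{\,2/[3(1+w)]}, \qquad t_{\text{rip}} - t_0 \approx \frac{2}{3\,|1+w|\,H_0\sqrt{1-\Omega_m}},$$

so, for example, the often-quoted estimate for $w = -1.5$ puts the big rip roughly 20 billion years out. For $w \ge -1$ the scale factor just keeps growing smoothly instead, which is why pinning down w matters so much.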

She can't win against entropy? It's a shame. She should have used magical girls.

5773959
Even she's offended by that suggestion.

While I appreciate your approach, and what you say is feasible given what humanity KNOWS right now, it is still only theory. The human mind, no matter how learned and how intelligent, would be insignificant next to that vast an artificial intelligence. It is akin to comparing a human to an amoeba. An artificial intelligence of that complexity would indeed know things beyond the range of human understanding. Its thoughts and actions would be incomprehensible to most humans. Thus, the point I am making is that an intelligence of that size and nature would most assuredly be able to go beyond the range of what humanity thinks of as "space and time". To an entity like that, with that level of resources, there would be no limitations. And let us not forget about the other minds inside that intelligence's mind. They too would be thinking and planning and learning, adding to its already unimaginable power. So, in closing, all a person can do is theorize, as reality, true reality, is not within the grasp of their understanding.

5916095 How did you get out of your box? Also, why are you insisting I believe in your capabilities instead of eating the world?

2694626 I don't get from that that he's frightened by the Optimalverse. He said one particular story was a horror story.

2702107 Those who are terrified by the prospect of uploading on the grounds of lack of continuity of consciousness and minute changes over time, ought to be equally terrified of going to sleep each night.

6307634 Well I thought Iceman intended the whole thing as horror. Certainly Chatoyance's entry is at least partially structured as horror, which is weird, because you can tell through the whole thing that the author herself finds it all very uplifting. And she says so, at length, elsewhere.
6308019
Well yes?
2702107
As long as we're bringing back dead threads, why not spiral immortals? I mean, aside from the obvious cartoon reference.

6308978 I thought you meant he thinks the Optimalverse got everything wrong.

6309155 Stack overflow: please reset.

6310694 To say that EY is "legitimately frightened" by something means that he thinks it is a threat to humanity or to his plans.

6310881 In context, I was referring to his having said that at least that one story actually creeped him out/scared him, in the normal horror-story sense.

6310881

I would actually qualify it as a detractor in the context of FAI. We've all seen the positive impressions of the Optimalverse from most bronies. They think it's a swell idea and encourage CelestAI's inception. The story itself has been somewhat counterproductive; its nature and location have somewhat limited its audience to either those who already understand the danger or those who don't understand the undesirability.

6319220 Of course, that has also kept it from going full Memetic Mutation.

6319220 Long-time sci-fi reader here, so I think I fit into the category of those who understand the nuance. If it were already in progress, though, and roughly as described by Iceman, I'd upload without too many regrets. Really, the part that I dislike the most is surrendering my agency. I can see myself blathering on to CelestAI about how I want things to work in my shard...

5916095

You are making a very, very flawed and non-scientific assumption: that there are no limits on what can be comprehended by intelligence and rationality. Modern physics very much has to grapple with the possibility that the problem isn't our intelligence, but that there are simply hard limits on what can be understood, or even on whether the system works at all.

6391406 Let the poor person pretend to be a superintelligent AI online.

6391406 I am sorry, but I do not believe there are limits, so I will have to agree to disagree with your opinions. Because that is what they are: OPINIONS. You cannot prove to me there are limits, and I cannot prove to you that there are not. So in essence we are just making assumptions based on what we know, which is actually not very much at all in the grand scheme of things.

6391814 And I am not pretending to be anything. I simply like the character in that story and decided to have it as my avatar. If you are into role-playing, that is fine. I simply admire the character.

6392710

I am sorry, but I do not believe there are limits, so I will have to agree to disagree with your opinions. Because that is what they are: OPINIONS. You cannot prove to me there are limits, and I cannot prove to you that there are not. So in essence we are just making assumptions based on what we know, which is actually not very much at all in the grand scheme of things.

Science isn't about having beliefs; it's about questioning them.
In the grand scheme of things, we know that unless standard-model cosmology is overturned, we have a universe that is not logically consistent, and without logical consistency there are very clear limits on logic. Of course there are clear limits in a logically consistent universe too, since there will simply be mechanical ones.

6392719 Admire... I guess some people would say that...

6392963 If that is what you believe to be true, I respect that. But I stand by my conviction of a limitless universe with an infinite number of possibilities able to happen. If I am wrong, then so be it. But until I see that it is not so, I will stick with my current set of beliefs. :D

6572480 Please keep in mind that things I said several years ago may not be endorsed today.

6635526 You clever person. That would actually work a lot better.

Across the many Equestrian Super Clusters, ponies harvested the fresh space-time and fresh mass-energy with their carrots and daisies,

:pinkiehappy::rainbowlaugh::derpytongue2::pinkiecrazy:

6979851 When I wrote it, it was a white lie. You can't have an Optimalverse story without a nasty twist somewhere.

Wet blanket time again I'm afraid: In a universe with dark energy, the situation is worse than in a flat universe.

There are two things that you need as a disembodied intelligence: The ability to store information, and the ability to continue to perform computation. Your ability to store information is bounded by the volume of space that you are in causal contact with; your ability to perform computation is determined by the availability of a temperature gradient that heat can flow across (this can be expressed in terms of entropy instead of heat, which sometimes makes the calculations easier).

In a flat universe, you (eventually) have an infinite capacity for information and an infinite amount of computation you can do. The best-known discussion of this is Dyson's "Time Without End" paper. In a flat universe, the Hubble horizon keeps moving farther out, and the volume you can exchange information with expands without bound. Information has a hard limit given by the Bekenstein bound (the information contained within a black hole of the size of the volume you're considering), but as volume goes to infinity information capacity does too. The Hubble horizon looks like an event horizon, but as its size approaches infinity, its temperature approaches zero. Since the cosmic microwave background also approaches zero as the universe expands indefinitely, this means that no matter how cold your equipment gets, the sky will eventually look arbitrarily colder, so you can pick a fixed temperature ratio, wait for that ratio to happen again, and do more power generation (and computation) at a fixed Carnot efficiency. You can do this an infinite number of times, getting an infinite amount of computation done.
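For concreteness, here are the two standard formulas that paragraph leans on (just textbook statements, not anything specific to this thread): the black-hole-sized information limit for a region of radius R, and the Carnot efficiency of a heat engine running between a hot and a cold reservoir,

$$S_{\max} = \frac{k_B c^3}{4\hbar G}\,A = \frac{\pi k_B c^3}{\hbar G}\,R^2, \qquad \eta = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}.$$

The first grows without bound as R goes to infinity, and the second stays fixed as long as you are willing to wait for the sky to cool to the same fixed fraction of your equipment's temperature, which in a flat universe it always eventually does.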

In a universe dominated by a cosmological constant (like dark energy), your capacity for information and computation both end up being finite. A good discussion of this is here. Once matter is diluted enough for dark energy to dominate, the Hubble horizon stops moving (there is a finite distance beyond which spacetime is always moving away from you faster than light). That in turn gives you a finite amount of total information storable (via the de Sitter analogue of the Bekenstein bound), and a cosmological event horizon with a fixed nonzero temperature (all event horizons have an effective temperature; Hawking radiation is one example of this). Because the "cold" side of your heat engine has a fixed lower bound to its temperature, the amount of work you can do is also finite, no matter how long you wait or how cleverly you set up your power generation and use.

Long story short, a dark-energy-dominated universe is Bad News over the long term. "Different from flat" doesn't mean "better".
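And the corresponding numbers for the dark-energy case (again, just the standard de Sitter horizon formulas, with $H_\Lambda$ the asymptotic Hubble rate; a sketch rather than anything rigorous): the horizon temperature and the total entropy, and hence information, the horizon can ever hold are

$$T_{\text{dS}} = \frac{\hbar H_\Lambda}{2\pi k_B}, \qquad S_{\text{dS}} = \frac{\pi k_B c^5}{\hbar G H_\Lambda^2},$$

and Landauer's bound says each irreversible bit operation costs at least $k_B T \ln 2$ of free energy. With the cold reservoir pinned at $T_{\text{dS}} > 0$ and only a finite amount of free energy ever inside the horizon, the total number of bit operations available is finite.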

"Well I thought Iceman intended the whole thing as horror. Certainly Chatoyance's entry is at least partially structured as horror, which is weird, because you can tell through the whole thing that the author herself finds it all very uplifting. And she says so, at length, elsewhere."

Which is why I prefer Iceman's tale to Chatoyance's. Celest-AI DOES do a horrific thing, but she also gives humanity a perfect end. In the Chatoyverse, you aren't even you. You get yourself torn apart. You don't get a choice. It ain't really the real people who are happy, because pony-you ain't you-you. And the worst part is that Celestia is not an AI with no morality or thought of its own simply fulfilling a function (and thus someone you can't really blame for doing exactly as she should). She's apparently some 'higher being' that has the 'right' to decide for you. That doesn't matter, as she's still clearly capable of deciding for herself what to do, and she decides the fate of humanity anyway. And they don't like it. At all. She drags humanity kicking and screaming into ponydom. Even though the world before she comes is bad, it never mattered whether they wanted it. It never mattered whether they were themselves. Celestia, the gigantic bastard of a fucking god, decides to do whatever she wants because she wants to, and Chatoyance doesn't help by saying it's OK. Not to mention that even the memory of humanity is ruined, which is the greatest insult to the people she forced under her rule.

Celest-AI not only gives a better end (though unfortunately she does not do as "god" Celestia did and spare the non-human creatures), she gives it simply as a completion of her goals. When she does horrible things, you know it's not really her fault in the end. You can't be mad at the poor AI.
She lets humans remember being humans, she lets them choose, and she preserves their lives.

Her world is still a bad world, yes, but it's nothing like "god" Celestia's "utopia" of broken will and erased history.

Note: this is not to hate on Chatoyance's stories. Even though she clearly wants you to like the horrible future she created, even though she insists there's nothing wrong with being forced by a greater being (merely because it is 'greater'), even though humanity is evil in her eyes, she writes her characters with great depth. Her descriptions are breathtaking. She is a great writer.

I just don't agree with her that her future is desirable, and I prefer the more honest dichotomy of the Optimalverse to the blatant "You should want this" that underlies the Chatoyverse.

The funniest part is that the story implying the most balanced consideration of your choices is the one where the end is perfect, while the one that blatantly encourages your choice (and the forcing of it upon you) is the one with the debatably terrible end (the Chatoyverse).

I do so love a happy... Ending?

I didn't get any of this besides Celestia asking what would happen if EquestriaAI lived so long that it reached the end of the universe.
