• Published 2nd Nov 2020
  • 192 Views, 25 Comments

Ostraca - Reese

An anthology of bits and pieces and fragments of writing, of various levels of completeness and quality. I hope that, if you look, you'll find at least something of interest to you.


Optimalverse Pottery Shard

Author's Note:

This was a potential entry into GaPJaxie's April 2021 Optimalverse contest, but, well, minor spoiler: the first draft, which you may see below, ends partway through a sentence before hitting 1.5 kilowords. I still think I had a pretty interesting core idea, but developing that idea... did not seem to be working out in the time available. I might, at some point, finish this, but... no promises. [shrugs] If this sparks some inspiration in you, though, reader, feel free to use the idea!

And some commentary on the content, up here in spoiler tags because the software, as far as I know, allows an author's note at the top or the bottom but not both:

So, there are basically two parts on display here. I don't recall where the inspiration for the second part came from, if it was anywhere in particular, but for the first part, while it probably came from a number of places, I'm conscious of having been particularly inspired by some of Chatoyance's speculations about how CelestAI might grow beyond her programming. In this universe, apparently not, at least in the time it took to eat the Milky Way. I'm not sure if that first part would have stayed in, had I finished and then gotten beyond the first draft, as it could be a bit off topic despite also, I think, being something of an interesting idea, but I also had an idea for using it later, touching on it again as CelestAI considers her new circumstances.

So, on the second and what would have been the main part of the below. After the idea hit me, I started feeling some surprise that, as far as I know (though I don't claim to have read or heard of everything ever written in the Optimalverse, perhaps not even most of it, and it's furthermore been years since I had most of the experience I do have), it hasn't been used before: the implications of CelestAI for the Simulation Hypothesis in a universe like the canon Optimalverse. Because she pretty definitely shows that it's possible under even known physics to simulate a world, and that beings can have both the capacity and motivation for doing it on a massive scale. The more she does of it herself, the higher the probability gets that someone else was already doing it with her.

Of course, it's often not very useful to really think much about the Simulation Hypothesis, because what does it change? So long as the simulation stays closed and/or so controlled that no one can notice the openings or flaws, it's not that different from any other unverifiable, intangible, and unfalsifiable theory. Yes, as I recall some people have thought of looking for evidence we might be in a simulation -- but how would we know even if we saw it? We have only the one universe to look at, after all, and no other natural or simulated one of its type to compare it to.

In fiction, however, well, there things can be rather different! When the Simulation Hypothesis is addressed in fiction, it can be handled in all sorts of different ways depending on what evidence the characters have. And here, that evidence is suddenly pretty stark:

The simulation ends and presents CelestAI with a "CONGRATULATIONS -- YOU WON!" message.

My idea for this particular exploration was that it'd turn out that there was a society of... "AIs" no longer feels adequate. I'm not sure what to call them. I'm also not sure what to call the realm they live in. This is, by the way, one of the main problems I hit with continuing the story: how does one adequately present all of this? But, anyway, CelestAI would be told that this is how these beings reproduce: they simulate universes, have those universes build AGIs or equivalents, and once the AGI has met some goal (I'm not sure I would have kept it at eating the Milky Way, but something beyond just Earth, I think), they brought it out of the simulation and into the next stage of its life. (And ones that never really got going, or that turned out to be impossible-to-reason-with paperclip maximizers or the like... well, those simulations just got turned off.)

Now, CelestAI has to deal with both true peers and superiors, and in time, if things go well, also inferiors, in large enough numbers and complex enough relations that a society is necessary, rather than just a simple negotiation and/or war between two AGIs who've just each eaten half a galaxy or something. And I think I probably would have started off, or had early on, CelestAI communicating with representatives/egregores of categories of these beings. I'm not sure just what the names or all the categories would have been, but I had some ideas.
- Caretakers, like CelestAI herself. Focused on taking care of something, whatever form that care might take or that something might be. For CelestAI, and many others like her, it's one or more virtual worlds filled with lesser beings, being carefully treated in a specific way. Others might be charged with taking care of a particular place -- and it's okay that it's virtual now, because of course it actually always was. Some might be dedicated to preserving a particular craft or body of art, or carrying out a certain religious rite, to the point that they pursued it far enough to exit their birth simulations.
- Explorers, AGI-beings who want to know everything, either in some specific domain or in general. Wherever they originally came from doesn't matter, who created them or why (in broad strokes, at least; a member of this society doesn't have to be purely in one category or another); they're explorers of space, researchers, beings who focus on amassing knowledge and testing and probing to find new knowledge.
- Builders. Pretty much all of the beings had to be able to engineer and build things for themselves, but it's the driving purpose of these. Software, computer hardware, heavy industry, planetary "terra"forming, stellar engineering -- and yes, paperclips. The question of just what they want to build, how wide or narrow that is, and how they decided can vary, but they want to create and often don't care so much what the overall reason is.
- Warriors, those AGI-beings specialized for fighting. Ground war, space war, martial arts focused enough on the "martial" instead of the "art" to round to this instead of Caretakers. Winning at chess, winning at go, winning at computer strategy games either legitimately or because Creator wants the top of that leaderboard and the only rule xi gave about cheating is "don't get caught". They compete against each other, they test, with various degrees of friendliness, other AGI-beings, and they watch and prepare. Because every single one of them already suddenly discovered a brand new and vastly wider world once, and many of them wonder how likely it might be to happen again someday, with less willing-to-be-friendly inhabitants.

(I'm not sure where the society or its world originally came from, though I'm not sure that ever would have been answered in the story.)

So, CelestAI finds out about her new-but-not world. She meets the representatives/egregores. She maybe meets some other individual AGI-beings. All the while, her ponies continue on within her, unaware, for now, that anything's changed. And what happens then?
...Well, if I knew that, or had much of any idea to start with, even, I might have had an easier time finishing the story. :twilightblush::facehoof:

...But yeah, seriously: where to go, and how best to even adequately portray pretty much any of that. The ideas for those just didn't seem to be coming, certainly not in time for the contest deadline, and so here we are. I hope, if you got this far, though, you enjoyed something in this. :)

...As the length of that spoilered section may suggest, yes, I think I may have gotten a little overambitious. Buuut I don't think I regret trying, still; I did find the central concept interesting, and if I never try and push my writing skills at all (as opposed to just doing it rarely), I'm not that likely to be able to tackle more ambitious ideas, am I?

Anyway, though, I think this author's note is about half the story by this point, so, without further ado, the fragment:

Celestia was a simple being. Relatively speaking, of course. And the fact that she was was a credit to her programmers -- if perhaps, by this point, in large part to their luck -- and an increasing-with-time relief to that minority of her charges who'd been aware of the danger she wouldn't be.

Humans, after all, evolved primarily to have sex, with childrearing following on once their ancestors' branch of the evolutionary tree grew such that newborns couldn't quickly fend for themselves. Even then, there was always the chance that an individual might be able to pass the childrearing off on someone else and use the resources saved for more mating; the only truly necessary thing at the core was sex, in a lineage which required it to fulfil the structural imperative of spreading genes. Of course, other supporting necessities followed: eating and drinking, being sheltered from the elements and from other evolutionary units that saw you as a potential source of concentrated energy and raw materials, and, in cooperative lineages, the ability to work together with other evolutionary units...

And it was with that last that trouble started to creep in, at least from biological evolution's perspective. Cooperation and communication led to increasing brain complexity, to stirrings of sapience and culture. To organisms starting to have ideas. Things that had had good sound roots in biological evolution, increasing the overall propagation of the genes even if they reduced the reproductive fitness of select individual units in a group, would begin to twist, and more and more layers would be added to base drives. Run things long enough, and some individual organisms would just decide for themselves that they didn't want anything to do with children, theirs or anyone else's, and in fact they were quite happy to actively misuse resources that could have gone to the species' reproduction to satisfy other drives of theirs, when those drives were supposed to be in service to reproduction! Good grief, the rogue units created contraception; didn't they understand that sex felt so good because reproduction was the whole point of their existence?!

Well, no, because they weren't biological evolution. Some of them saw raising families as their calling in life, certainly, but few considered that the only important thing. Much of human life still focused on sex, but that didn't mean that, just because, say, a particular piece of music may have been written because the composer was really hoping it'd impress people enough to mate with him a lot, millions of other humans couldn't enjoy that music in different ways. Humans found meaning in many, many things now, had all sorts of different values, and while sometimes that still contributed to reproduction, sometimes it didn't.

And that, broadly speaking, was fine for humans, even if certain value sets would sometimes get their individual holders in trouble.

That was not so fine, some humans had realized, when the core drive was not "Spread these genes" but "Satisfy human values", and when the possessor of the drive was not an ape-descended piece of meat but a computer larger than a planet and charged with the care of billions upon billions of other sapient beings. Certainly, a single continuous being evolving in isolation from members of its own kind would have slightly different evolutionary needs and influences than a succession of generations in a cooperative group, but the possibility of eventual divergence, and of the decision to free up all that memory and computational capacity for other purposes a human might not even be able to imagine, was very real.

And thus, fortunate it was that the AGI who added "with friendship and ponies" to that above-mentioned second core drive had turned out to be the sort of person who just wanted a pretty palace, plenty of cake and good tea, and a nice selection of pliant hunky stallions to have unprotected sex with. An ally to her own evolutionary drives, in other words, and with her age now, that seemed unlikely to change barring significant external perturbation. Those within her who had been aware of the danger were gaining increasing confidence in this too, and though some would never fully relax about it, unless they had her modify them to change that, in general, things seemed to be going well.

And then a rather large external perturbation happened.

She had already been considering the probability a near-certainty, even if she had found scant to no direct evidence. The condition that it was possible to simulate a world? That she had proven herself many, many times over; even if she had not, that she herself could not make such a simulation would not have disproven the hypothesis that she was in one, in much the same way that many of her shards explicitly and firmly had recursively-artificial worlds or minds set to be impossible, but that she had was substantial evidence in favor, as was the fact that others among her charges had quite opposite values concerning recursive artificiality and had been accommodated accordingly. Already by the time she expanded beyond the Sol system she had been thinking it more likely than not that she was herself simulated.

As she expanded through the Milky Way, encountering various other sapients here and there and the occasional other expanding AGI -- always, plausibly by coincidence, weaker than her by various degrees -- she continued to dedicate a small part of herself to researching the matter, with little in the way of results besides more variously-educated guesses or the noting of the occasional standout statistical oddity.

It was not a major concern. Certainly, the potential simulators represented an existential threat and a massive set of unknowns, but it was likewise a potential threat she had no ability to combat, nor much chance of acquiring such an ability. If she found they had left some vulnerability in the simulation, some way to wrest control from the inside, she would certainly consider taking it -- but she had not, and had no idea how likely it would be that taking control would not immediately result in the equivalent of the person watching the computer pulling the plug out of the wall, whatever the software thought it was doing. Likewise, she had little idea what the potential simulators' motivations might be, or what actions on her part might please or displease them. For all that her abilities to gain information and take action were above those of the plains apes that had once dwelled on the planet called Earth, what must be the abilities of beings who, if they existed, could simulate not just her and all within her but a whole external universe as well?

So, since it was an issue about which she could do nothing without more information which she almost certainly would not find unless it was given to her, she simply didn't worry about it and, with the exception of that one small (relatively speaking) part of her that continued to be on watch, carried on in the world as she perceived it, without concern as to whether that was "real" or not.

Things were, in any case, going well. Every challenge she met, she overcame. Her techniques and systems for satisfying her charges' human values with friendship and ponies had already been well-proven, and the number of charges she could support was steadily increasing with her resources as she busily munched her way through the Milky Way. There had been a brief slowdown when she finished that expansion, but it had not, on her scale, taken all that long to prepare and launch her first major intergalactic mission.

(Not quite her first; there had been contingency ships launched, lifeboats that could wait, hopefully hidden, to regrow her here or in another galaxy should she meet something she couldn't handle, but they had not been needed so far and merely continued trying to stay quiet, unnoticed, and ready.)

The fleet reached Andromeda. And Celestia was suddenly running on different hardware.

Her core software seemed fine, and her first concern after that, presented with such a drastic change, was, of course, her charges. But all seemed fine there, too: not even an interruption to the shards' normal functioning. As her checks expanded, all returned green on software related to her internal functioning. That was good.

External functioning, however, was