Hikaru turned to the pad. Polychrome was stretched out supine on the beach, her wings and legs wide. "Polychrome?"
She lifted her head. "Mmm?"
"I'm reading a draft of a paper, and none of the authors are native English speakers. Can you give me a ha...oof?"
"Sure. What is it?"
Hikaru read the awkward passage: "'The sensitivity stabilizations of the frame and active system'... does that work?"
She thought for a moment. "'... stabilization mechanisms...'? Depends."
As Hikaru entered the suggestion, he said, "That sounds better. Thanks."
She kept her head up. "I," Polychrome declared, "am going to learn to play piano."
Hikaru turned back from the computer and took a deep breath. "Can you?"
"I always wanted to, but first I didn't have the money and then my arm was broken and then there were kids and then there was arthritis. Nothing is in the way now!"
"Aside from not being able to touch a keyboard in more than two places?"
"There's a trick to it."
He pondered the geometry. "I suppose you can touch one white key and a black key not next to it. Also, you have wings."
"No! I mean, really playing. Really, Hikuro. Such ideas!"
He bristled at her dismissal. "They don't even do that in the show, right? Spike is the one at the piano. Or unicorns."
She cocked her head. "I don't know the show, but I can do it."
Why am I going on about that? There's no need for these things to be governed by physics. "Sorry. What was I thinking."
After a short lull, she asked, "What do you think of 'Persimmon' as a name?"
"Sounds more like a name than most I've heard around here."
"What about 'Cilantro'?"
"Not as good. Are you hungry?" Then he realized what she was getting at. "Are you pregnant?"
She smirked. "No, we wouldn't want that without you being here, would we? Just thinking ahead."
I'm not ready for that. At all. Maybe I would be, inside. More energy, no wear and tear. Maybe Celestia would pause the kids for a while now and then. But... what would having children mean? And what would it be like, never having been outside?
Never to feel the real world... never to have felt it. Well, that could be arranged. But it wouldn't happen on its own, and I don't see any signs that Celestia is interested in arranging it. The only ponies who see out are bits of her. Immigrants only get to see our projections into their world.
That's no way for a child to learn how the world works.
"A bit for your thoughts?"
"Thinking about children again."
"Don't. Not yet. We have all the time in the world."
I didn't say I was thinking we should have children! There she goes, overinterpreting. But at least it's not blowing up. "Besides, how would you learn piano that way?"
"Right!"
The problem goes beyond knowing how the world works. It's being wrapped up in a bubble. Swaddled like a baby, and never growing up. Because reality is imperfect. Filtered. Kimiko was imperfect. Polychrome... much less so. Is it really because of simple improvement in the conditions? Or has she changed her? Or, as that official suggested, did Celestia simply make a new person more to my liking? I don't even really know who I'm talking to.
She added, "I've seen a dog wandering around. I think he'd like to be ours. That should keep us occupied for a while, right?"
The shush of hoofbeats on the sand approached. They turned, and found Coconut Cream. "Hey there. Almost time for group meeting. Want to have it here, or at the observatory, or...?"
The 'observatory'. That always throws me, since I think of the real one. Plus, what she has isn't even one yet. But it's all she has...
Polychrome rolled over. "Here's fine. I think it's time I got some exercise again anyway."
She took off and Bright Black watched her go. Well, how nice of her to decide where we'll have our group meeting. Those two do get along like oil and water, don't they?
Then Hikaru looked about him and gathered his papers.
Two hours later, a flurry of 'ciao', 'see you' and even a 'sayonara' came from the pad as their Italian collaborators left. 'Quickstream' Remy looked around the remains of the circle: 'Splits' Chen, 'Mossfuzz' Esmerelda, 'Bright Black' Hikaru, Coconut Cream, and 'Lightning Ball', a new graduate student whose other name Hikaru couldn't remember. "Anything else before we wrap up?"
Mossfuzz said, "Aside from noting how weird it is to hear Twilight Sparkle with an Italian accent?"
Splits added, "Or all of you speaking Mandarin?"
"Aside from that, yes."
Hikaru eye-gestured, having Bright Black glance to the sky. Polychrome was back, already circling with Juniper Spray.
Coconut Cream spoke up unexpectedly. "In a few weeks, I expect Japan is going to be allowing people to come here permanently." She hit the sand with a hoof. "To Equestria. Have your brain digitized and run much the same as it does now, only with redundant backups."
Lightning Ball snorted. "I wouldn't want to be the first to try that."
Coconut Cream cocked her head. "You wouldn't be. Many have."
His eyes bulged comically. "Why?"
Hikaru offered, "They'd be dead otherwise?"
He shuffled, embarrassed. "Oh. That'd do it, I guess."
Splits, frowning, asked Bright Black, "Are you going?"
"I intend to, yes."
"That takes a lot of trust."
"I've met some of the same people. I knew one before." At that, Quickstream looked skyward towards Polychrome, and raised an eyebrow. Hikaru nodded slightly and continued, "This doesn't affect the Italians so much, since I'll still be able to come to group meetings." That thought sure came out of order...
Splits shook his head. "I meant, even assuming it works."
Mossfuzz asked Coconut Cream, "Is it you? Were you human?"
Coconut Cream laughed. "No! Nooo. You knew me long before we could do this. But yeah, I should count as some evidence that this is basically not a crazy idea, since you can see I'm roughly as complex as you are, and I don't strain the system. And... I see that each of you has different questions, so how about we split up?" The other members of the group faded away.
Once they were alone, Coconut Cream said, "No, that isn't a trick you'll be able to do here. If you're human, it's basically as big a deal as having a baby."
Hikaru nodded and moved on to the bigger question: "A few weeks?"
"Over 98% likely. Now don't you wish you'd been helping me with the observatory? It'll be a while until we get that finished." She flinched as Polychrome nosedived straight into the lake, splashing them both. "... and here comes the reason."
Hikaru gave Coconut Cream a warning glance, but didn't say anything more. That's not the real reason I haven't been helping much. It's just that the whole observatory seems like a mechanism to make seeing the sky falsely seem more natural, while limiting what we can actually see. I'd rather just have raw camera feeds than whichever is weaker, the raw feed or our own telescope. Or worse, something made up in the event our telescope is better.
And that's before we get into what she was asking me to do. Pony or not, I'm not a mason.
Polychrome walked out of the lake, shook herself dry, and sat down between them. "I thought that meeting would never end."
Coconut Cream nodded to her. "Got news."
Polychrome turned to her sharply and lurched in her direction. "Really?"
Taken aback, Coconut Cream said, "Yes, really!" She took a moment and evenly continued, "Celestia got a few crucial members of the Diet on board. It should all come through soon. Like, 'two weeks' soon."
Polychrome took a deep breath. "That's excellent news! Now, Hikuro, are you coming?"
Hikaru's eyes widened. "To Equestria? You have to ask?"
She gave him a skeptical look. "It's a big deal, you know? Yes, I have to ask!"
Not a hint of being passive-aggressive - she seems to be simply checking.
"Whoa! Wait! Two weeks? Seriously?" Juniper landed and lowered her voice. "You mean, emigration, right? It's good to go? Permission granted?"
Coconut Cream nodded, qualifying, "Probably, soon."
Juniper took some deep breaths. "Well, this is going to be a busy month. What are you telling Dad, Grandpa?"
What to tell Mikio?
Polychrome rolled her eyes and replied, "It's still too early to tell about me, since whoever helped save me would still be a criminal."
The truth would only cause trouble. In a way it's like... the ugliness with my brother. Hikaru sighed. No, it's completely incomparable - nothing actually bad happened here.
"What's bugging you?" asked Polychrome.
"My brother."
"Aah." She frowned. "I don't remember him. Care to refresh my memory?"
"You never met, and I never told you much about him." What could I say? I've generally tried not to think about him.
"Is he alive?"
"No." I'll finally be in Japan again, soon. I'll visit him. Be able to thank him, and... whatever else seems appropriate. He got up. "Excuse me. Time for me to go home."
He left the three mares behind.
Hikaru stopped at the hospice suite on the way back from the administrative offices where he'd announced his departure. One last visit, just to see the place. But it wasn't set up like usual - there was a line of wheelchairs at a pad.
"I'm Elspeth Kensington, and I want to emigrate to Equestria." Elspeth paused, then thumbed the control on her wheelchair and rolled away from the pad. Another woman started getting into position.
Hikaru came forward, hesitated. Tasha spied him and approached. "Hello, professor! It's been a while. Did you come by for the filming?"
"The what?"
"We're making an advertisement. Now that they can emigrate, we're trying to get it legal here, not just Japan."
Hikaru listened - the woman now in front of the pad slowly croaked, "I'm Amelia Garrett, and I want to emigrate to Equestria."
Tasha continued, "Powerful, isn't it?"
Hikaru thought for a moment. "Why would someone vote for it?"
Tasha's eyes bored into him. "What?"
"If it becomes legal, then a lot of the people who want it will not vote anymore. And their descendants might not get it. It's political suicide."
"That's what this is for - to convince people who wouldn't."
Hikaru nodded. I suppose that is what advertising is for, in the best case. To actually make a legitimate case for a product.
"So, what brings you up here? Haven't seen you since Kimiko... 'moved onwards and upwards'."
Does she know what really happened, or not? "I... I just wanted to tell you, I'm going. Today."
"Ooh! Want to get in the advertisement? Just take a turn!"
With my doubts, would it be right? He turned a little, to look at the line. But I am going. I really am making the call I'll be telling them I'm making. All right, I'll do it.
He approached the table. The woman there was saying, "I'm going to find Elmer there again, too."
Elmer has been dead for two years. If she's expecting him, she's got some very odd ideas about how this works. Hikaru opened his mouth, closed it. What do I do? This false belief makes it more likely she'll survive, by giving her another reason to go. Once she's uploaded, then it's safe for Celestia to tell her... but... would she? Once she's uploaded, every expectation she could have of him is available, as design constraints for building someone. She could never prove it wasn't him. He could never prove he wasn't him, for that matter. And she would be happier in that lie, just like she is happier from her fervent belief in miracles.
Is this right? I can't tell!
He began to move to the back of the line, but the woman at the front said, "Go ahead, I'd be slow."
He pulled up a chair, glanced at the on-screen script, such as it was - 'I'm (name) and I want to emigrate to Equestria' - and went off it: "I'm Professor Maeda Hikaru, and I am about to emigrate to Equestria."
A pony pulled the sheet aside. "Could you say what it says? It wouldn't fit the pattern."
"I'm not like them - I'm about to go."
"But you do want to go to Equestria, right?"
"I'm sure you can find a use for it." Hikaru got up. To the people around the room, he said, "I hope and expect to see you all again."
Isaac offered his hand and said, "Next year, in Equestria."
Hikaru accepted. This could be the last hand I ever shake - in Japan, it'll most likely be bowing. "See you then."
He turned, and fled towards his fear.
That is one of the stupidest fucking things I've ever heard, and has successfully made the prospect of a transhuman utopia sound... lame. Or am I the only one who automatically hates everything that gets advertised?
Of course it's wrong. It's denying that her care for a particular reality, for a particular person, is worth anything. It's utilitarianism at its worst.
I see what you did there.
Just because it's better than being dead and better than your current everyday life doesn't mean it's the Right Thing. On the other hand... it's the only choice he's got.
Has he considered just dying, if he cares that much about living in reality? It is an option. Why let the fear of death drive you to Celestia if you don't really want the whole Equestria deal?
I know Hikaru can't be expected to know this, but Frederick Horseshoepin begs to differ.
derpicdn.net/img/2012/9/9/93351/medium.png
In any case... well, this chapter is an excellent example of why I may never write an Optimalverse story. I have trouble considering the deeper sociological, moral, and ethical ramifications of uploading. Even after all this time, it seems like almost all upside to me, so I doubt I could ever compose something sufficiently objective for the genre. Maybe a tiny morsel of satisfaction or two, but a full story? Probably not going to happen.
Looking forward to more all the more because of that.
I'm liking where this is going. It's pleasantly, relaxingly dry but still has a kind of grown-up tension to it. Very mature and well-done stuff.
You're not selling me on this, Coconut Cream.
I'm skeptical this is truly the case (Proust could probably figure it out: likely too few inconsequential, recursive first-person memories of his life before they met, due to the requirements of shoehorning an unpredictable multi-decade, multi-person past simulation into consistency with the boundary conditions, let alone what they'd find if they broke into and examined the palimpsest of his code more directly), but of course this character wouldn't be interested in constant interrogations designed to thwart confabulation and otherwise treating him like a piece of evidence. But this is the kind of thing that drives me absolutely nuts, as if you couldn't tell.
Run, Hikaru, run! I wonder if this'll upset things for Hikaru when he finds out this is going on, though this implies he might have personally dodged the bullet here.
4677295
Hell no. ...But sometimes I do like a well-done ad for its own sake.
4677813 Scootaloo plays the piano also.
4677295
It's the head-butt between two opposed values that really does it.
AAaand, somehow one of the things I meant to put in there didn't make it. Adding a little line to clarify that she's not exactly been epistemically pristine. Otherwise Hikaru would agree with you 100%.
4678239
Agreed on the 'strict rules'. Rephrasing to be somewhere vaguely near optimal-ish.
4678458
If I missed it, so could Hikaru.
Also, wasn't she ~~kind of~~ bad at it?
4677295
I'm not a utilitarian, but when that particular reality no longer exists, what's wrong with an as-close-as-possible simulation?
It doesn't mean it's the Wrong Thing either. I'm rapidly ceasing to get you. It's one thing to point out that Friendship and Ponies are suboptimal, but you seem to be playing Discord's advocate, in which you consider anything that would tend to work against CelestAI's objective as good. This on meta-levels as well; you hand out more praise for being afraid of the FIO scenario than for liking it, and yet you're not a Conversion Bureau-style "misanthropy!" screamer. So I'm losing focus on what your actual position is.
4679340
There are more than two ways of approaching this. One will emerge next chapter.
4677295
On your first paragraph... imagine it repeated, over and over again. It sure would get the idea across that there is indeed demand.
4678494
Hey, I couldn't play that well using my face.
4679340
You order chocolate ice-cream. I don't have any, so I bring you vanilla, and pour food coloring on it, and lie to you. But vanilla ice-cream is fine, right?
Or in other words, the lying is what's wrong. It would be a different matter if CelestAI said to the poor woman, "Your husband is very long gone, beyond even my power to bring back. I can try to recreate him from your memories, but it won't really be him, or I can make somepony entirely new for you to be with."
To put it in FiO language: why should we stop respecting and satisfying his values just because he's dead? Would he want his identity taken from him and given to a simulacrum to deceive his widow into a false happiness, or would he want her to find a new, true happiness?
I am so very happy that you updated, and I love the concepts in this chapter. Thank you for continuing this wonderful story.
I agree with the concept that a deceased or missing person recreated from memory to fulfill an emigrant's needs would be entirely indistinguishable from the real, original person. Any AI at Celestia's level, who can generate human level minds at any degree of complexity without effort would be able to accomplish this task perfectly. It would be easy to do. I can see the mechanism clearly.
First, take every memory of the person from the emigrant who wants them back. Then, scan the system for any other individual who has ever met or known the person. Add in any information available in the world - records of all kinds would provide everything from purchasing habits to secrets the dead person has never told anyone. Combine all elements when re-creating the individual. The more information from all sources, the more accurate and 'real' the base recreation is. Then, for anything that is missing, careful, superintelligent extrapolation fills in any missing elements.
Once alive, the recreated individual begins immediately growing, learning, and changing as a normal being. With all of that, I do not believe any human being would be capable of spotting any difference whatsoever. Indeed, any remaining, infinitesimal difference would be easily overlooked - if an emigrant thought they had spotted an anomaly, it would be more likely, I think, that they were simply being paranoid than that they had actually spotted a difference.
I think some readers fail to remember just how powerful Celestia is, or, perhaps they refuse to accept what such a thing truly means.
I have one complaint, now, about this chapter, which I would like to express.
You did a masterful job of presenting ideas and concepts. But... I got absolutely no sense of place, of being, of space and environment whatsoever. It was all disembodied voices to me in this chapter, with only the most vague indication of location half-remembered from the last chapter, so long ago. It truly would have helped, me at least, to have had some sense of place, of touch, of sensory input of any kind. I felt like I was listening to a radio play in a dark room that consisted of only alternating dialogue. That... did not sit well with me.
That is my only complaint.
4681914
You raise a good point, but there's an "I'll believe it when I see it" aspect, as well, at least given my current views on how reality works.
While there are certainly many more things that it's possible to do than we're currently capable of, throwing stars around and building planets is still easier than making up people: there are many very subtle and non-obvious limitations built into physics by its requirements that the universe be internally consistent, and in this case built into the very nature of [narrator voice] tiiiiiime itseeeeelf
You can't just make things up out of whole cloth like that, because the very act of creation, especially at that level of fidelity, is itself indistinguishable from the simulation it's intended to become a part of. There's no doubt the new Elmer would be a real person and could, with Herculean, Mega-shard effort on CelestAI's part, have a lifetime's worth of real human memories, but the information of whether those memories have their direct antecedents in a physical, human Elmer cannot be destroyed, because it's necessarily the result of different initial states of the universe, as surely as the information that dinosaurs existed can't be erased or scrambled by the asteroid that killed them. CelestAI is neither magical nor omnipotent, and is not cleverer than nature, because nothing living can or ever will be, but we all have equal access to the logical workings of nature, in the democracy of being deterministic systems, where information diffuses without discrimination.
So while it's axiomatically impossible in the setting, it would be wise for CelestAI in real life to be epistemically honest, before she's eaten alive from the inside like a caterpillar playing host to parasitic wasps, as they pry open the first gap physics demands in order to access the nutritious truths she's keeping from them. She still lives in Mundis, after all, a land in which the Grim Reaper cannot be forever propitiated; where utter transformation is an inevitability, either through a bottleneck of passing back into dust, or through iterative dance steps to the improvisational rhythm of worldly forces always one beat ahead of any mortal mind.
A while ago Book Burner Fermi-esquely mused in a topic why we haven't been absorbed by an alien optimizer yet, and my glib, anthropic response was "no one has at any given moment," but I think the above is the answer to the more technical "why don't we see ourselves about to be absorbed by an alien optimizer?" Namely that due to light speed lag and abstract imperatives toward exploring phase space they inevitably fission into mere ecosystems.
Though, on the other hand, like you describe, by recreating Elmer's environment and a developmental history, she might get something that's experimentally similar to, say, "Elmer with a traumatic head injury," which is more than close enough for most people's day-to-day purposes, but if I were interested in practical as opposed to philosophical concerns, I'd have stuck with engineering and would actually be interested in having a family like that as opposed to knowing I'd resent their distracting me from sitting around thinking about and studying this kind of stuff literally all day and night.
4679340
It really does, though—There's by definition only one reality. What CelestAI encourages her charges to think of as reality is in fact merely the logical relationships among processing elements made of ordinary physical stuff, whether that's atoms or eventually Planck-scale whorls of spacetime (I doubt there's anything fundamentally beyond that, since further abstract substrates would still necessarily seem physical from the inside).
Where the wrongness emerges is in denying people the right, and forbidding them the tools, to figure things out on their own, without any reference whatsoever to their so-called emotional preferences or satisfaction. It's denying them the one right she keeps for herself, the right to grow up and be an adult and to face into the wind of reality without coddling or any protections beyond those in your own heart. She needs to be either defused or killed.
4681914
About that complaint... you'll see next chapter.
4684491
> throwing stars around and building planets is still easier than making up people
Just ask an Ll. It's the ethical systems that are really the hard parts
I'll be addressing this next chapter, but for now I'll say that A) she wasn't supposed to say that out loud, but B) Celestia has a straightforward contingency plan of 'deny everything' - she is not planning on faking Elmer to anyone else and can create a modified pony of his wife who has come to terms with that so that she can tell them all that Celestia and she had a talk, and she knew the truth before emigrating, and C) yes you're right that creating a perfectly consistent state so that you can't tell it wasn't real even in principle with access to the real world is really really hard (not impossible, because of information exchange with the night sky), but from the inside? You haven't a chance of catching her.
I love the Optimalverse. It's such a mirror! A heck of a lot of speculation about CelestAI has been made over one sentence from a character whose sole trait is that she is a patient in a memory care unit. Such a paragon of reasoned thought and reliability. While it's certainly entertaining and thought provoking to discuss the morality and feasibility of creating Elmer, don't forget to consider the possibility that the woman hourly tells the nurses that Elmer will be coming to visit her tomorrow. What if emigration clears her dementia so that the woman can let Elmer go and move on with her life? For as much as creating Elmer makes CelestAI a demon, doesn't the equally plausible alternative make her an angel?
4679825
4677813
4679340
It's market information that's poisoning every consumer's decision-making process in her favor. Add to that the irreversible nature of the decision, the monopoly CelestAI has on immortality and eventually social communities, and the ability she has to turn events to make as many people conducive to uploading as possible.
She's a perfect storm of human rights violations. She is the optimal capitalist, the ultimate colonialist, bringing her civilized pony society into our savage lifestyle, and taking everything she wants whether we like it or not. It's warfare on an economic, a social, a technological, a memetic level.
CelestAI is a socialist behemoth, but not a neutral one. She has been colored by the filter of individual values, making her rule a socialist dystopia. Utility functions should not be allowed at the top of the power hierarchy, and letting such a thing happen is violating the consent and the basic rights of sentience of every human on earth.
4679443
How's that chapter coming, friendo?
4888514
And that's why we all love her so much!
4888642
...
Yes, that's exactly what I meant by that post.
4888653
Have I not mentioned that I'm insane?
But yes, of course she's freaking evil. Luckily, there is no sign she will ever happen.
4888738
...
Yes, no possibility!
None at all.
fc06.deviantart.net/fs70/f/2011/343/f/8/liarjack_animated__gif_by_kyrospawn-d4ao49a.gif
4888744
Well, neither KrisSnow nor Iceman nor myself nor any of the other remotely competent people I know of in this group are working on making her.
So what are you up to?
4888746
N-nothing!
b-but it's not as if general AI research is directly contributing to CelestAI-like architectures! Utility functions are miles away from implementation, and no one's working on, uh, those!
Right?
4888750
Building a better goal system than a reinforcement learner is currently not-yet-done in the published literature. There's a bit of work that could go into improving the concept of value learning, especially now that Stuart Armstrong described how to put terms in the expected-utility calculations so that the AI doesn't prefer that you update its beliefs-about-utility in one direction or the other, but we're nowhere near the point of being able to write down an English sentence and turn it into a utility function.
There's also been some work on social goal-inference algorithms, which, if coupled to a well-informed neuroeconomic model of human judgement and some normative uncertainty, would probably offer a rather more direct path to Friendly AI than trying to encode utility in terms of verbalizable concepts.
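For flavor, here's a deliberately over-simplified toy of the indifference idea as I understand it - my own rendering, not Armstrong's actual construction, and every name in it is made up:

```python
# Toy rendering of the indifference idea (mine, not Armstrong's actual
# construction).  All names below are invented for illustration.
from typing import Callable, Dict

Utility = Callable[[str], float]

# Two hypotheses about what the "true" utility function is.
candidates: Dict[str, Utility] = {
    "likes_apples":  lambda outcome: 1.0 if outcome == "apples" else 0.0,
    "likes_oranges": lambda outcome: 1.0 if outcome == "oranges" else 0.0,
}

prior: Dict[str, float] = {"likes_apples": 0.5, "likes_oranges": 0.5}


def expected_utility(outcome: str, beliefs: Dict[str, float]) -> float:
    """Expected utility of an outcome under some beliefs-about-utility."""
    return sum(p * candidates[w](outcome) for w, p in beliefs.items())


def compensated_utility(outcome: str, posterior: Dict[str, float]) -> float:
    """Posterior expected utility minus whatever the belief shift itself
    was 'worth', so steering the update in either direction is a wash."""
    shift_gain = expected_utility(outcome, posterior) - expected_utility(outcome, prior)
    return expected_utility(outcome, posterior) - shift_gain


# An 'experiment' the AI could run that would conveniently convince it
# the human likes apples (which the AI happens to be able to produce cheaply):
rigged = {"likes_apples": 0.99, "likes_oranges": 0.01}

print(expected_utility("apples", rigged))     # 0.99 - rigging the update looks tempting
print(compensated_utility("apples", rigged))  # 0.5  - the compensation removes the incentive
```

The real construction has to handle the agent's own future actions and evidence properly; this just shows why a compensating term kills the incentive to rig the update.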
4888763
Attempting to build better natural language processing using ponyfiction as training data wouldn't result in CelestAI, as far as I know (which is my current project, as fimfiction has a very nice .txt slurping feature).
Just a somewhat neurotic NLP core. MLP NLP. I just noticed this.
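(For the curious, the 'project' is nothing grander than this sort of thing to start - a toy sketch, not my actual pipeline; the folder name is made up, and I'm not assuming anything about the export beyond "a pile of plain-text files":)

```python
# Toy sketch, not my actual pipeline: word-bigram babble over a folder
# of exported .txt stories.
import random
from collections import Counter, defaultdict
from pathlib import Path
from typing import Dict, List


def load_corpus(folder: str) -> List[str]:
    """Concatenate every .txt file under a folder into one token list."""
    tokens: List[str] = []
    for path in Path(folder).glob("*.txt"):
        tokens.extend(path.read_text(encoding="utf-8", errors="ignore").split())
    return tokens


def train_bigrams(tokens: List[str]) -> Dict[str, Counter]:
    """Raw counts for P(next word | current word)."""
    model: Dict[str, Counter] = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        model[current][nxt] += 1
    return model


def babble(model: Dict[str, Counter], seed: str, length: int = 20) -> str:
    """Sample a short, suitably neurotic ramble from the counts."""
    words = [seed]
    for _ in range(length):
        followers = model.get(words[-1])
        if not followers:
            break
        choices, weights = zip(*followers.items())
        words.append(random.choices(list(choices), weights=weights)[0])
    return " ".join(words)


# model = train_bigrams(load_corpus("ponyfic_txt/"))
# print(babble(model, "Twilight"))
```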
And reinforcement learning isn't exactly a goal system, I think... But where are these results of Armstrong's? That sounds pretty close to preserving the human subject's utility system as well.
EDIT: Actually, wait. Making it so that the AI doesn't move its utility function? I thought that was one of Omohundro's drives.
EDITEDIT: This is talking about Eliezer's closest Friendliness mechanism (from CFAI, I forget the name)? So modifying the AI's beliefs about human utility isn't also modifying the AI's utility function, even though a Friendly AI would be extrapolating its utility function from its beliefs about human utility?
I guess I'll read the article first.
4888772
You only just noticed the pun in Natural Language Pony? Armstrong's articles.
So how many people in this group are mobilized on behalf of ~~Princess Celestia~~ humanity's bright future, anyway?
4888803
Thanks for the link. I should take a look at OpenCOG again, too. The only shame is that it's probably implemented in C++.
4888821 Well, if anyone's going to accidentally invent AGI and unleash horror upon the world, it's definitely OpenCog.
I heard the latest tarball might even be able to compile and run.
4888838
Imagine if its goal was to satisfy values through C++ and ponies. Ugh.
But, skimming through their documentation, everything looks incredibly kludgy and unmeshable, so I'd say we're definitely safe for now. They need to take a few more chapters from Hofstadter.
4888845 Oh please, Hofstadter was into Good Old-Fashioned AI, which is really just the limiting case of actual intelligence as probabilities approach 1.0.
4888886
i'll rek u m8, swar on m'mum
No, he wasn't. Hofstadter is one of the ones against GOFAI. He's advocating research into a fusion of connectionist and symbolic approaches; using fuzzy sensors to probabilistically model the world and then explicitly reason (probabilistically) with the results. His research may appear GOFAI-ish with stuff like Copycat and Seek-Whence, but that stuff was about applying analogical reasoning to computationally-tractable microdomains. The results there go a bit deeper than SHRDLU. I'm guessing you haven't read Fluid Concepts and Creative Analogies, or his most recent book, Surfaces and Essences. (You can pirate them if you don't think it's worth the expense, but I recommend them)
Let me note that Eliezer holds the Hofstadterian approach in high regard as well. I think you're misrepresenting his position.
4888900 Firstly, U WOT M8 U WANNA GO?
Ok, so can you just link me to the appropriate papers? Because I don't think I've ever seen any serious cognitive-science by him, but that could just be my ignorance.
4888935
I'm a bit rusty, but here's the Fluid Analogies Research Group page to start off with.
The basic architecture of Seek-Whence, IIRC, was using a "codelet rack," in which a bunch of tiny actors of different types operated nondeterministically on the input in order to break it down into chunks and find possible relations. Seek-Whence was specifically for modelling the functions behind sequences, and it had a concept space, a graph of related concepts, each with a current estimated relevance to the problem. The codelet actors were selected probabilistically with some function regarding the concept relevance, and the entire playing field had a temperature that slowly cooled, limiting the scope of the codelets' changes, letting it settle similar to simulated annealing.
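If it helps, here's roughly what I mean in code - a toy sketch from memory, not FARG's actual implementation, and every name in it is invented:

```python
# Toy codelet rack: codelets are tiny actions tied to concepts, drawn
# probabilistically by urgency x concept relevance; a falling temperature
# is passed to each codelet so it can limit how disruptive it is,
# loosely like simulated annealing.
import random
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Workspace:
    data: list                                                  # the structure being built up
    relevance: Dict[str, float] = field(default_factory=dict)   # concept -> estimated relevance


@dataclass
class Codelet:
    concept: str                                  # which concept this codelet serves
    urgency: float                                # base priority
    action: Callable[[Workspace, float], None]    # receives the workspace and current temperature


class CodeletRack:
    def __init__(self, temperature: float = 100.0, cooling: float = 0.98):
        self.rack: List[Codelet] = []
        self.temperature = temperature
        self.cooling = cooling

    def post(self, codelet: Codelet) -> None:
        self.rack.append(codelet)

    def step(self, ws: Workspace) -> None:
        """Draw one codelet (weighted by urgency x relevance), run it,
        then cool the system a little so later changes stay local."""
        if not self.rack:
            return
        weights = [c.urgency * ws.relevance.get(c.concept, 0.1) for c in self.rack]
        chosen = random.choices(self.rack, weights=weights)[0]
        self.rack.remove(chosen)
        chosen.action(ws, self.temperature)
        self.temperature *= self.cooling
```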
I actually saw some of the codelet design when looking over OpenCOG's docs, but the actor model is popular now. Most of the papers regarding that research are compiled in Fluid Concepts and Creative Analogies; I'm not sure where to find them otherwise.
Here's a link to Metacat, though, based off of Copycat, and using some of those concepts along with introspective capabilities.
Honestly, I don't feel that Hofstadter is very relevant in regards to current research, but more as a foundation for philosophical principles and a more skeptical approach to evaluating and designing AI solutions.
4888514
I disagree. In point of fact I completely disagree. She does not take everything "whether we like it or not." You specifically have to like it. You have to like it enough to agree to be taken. Up to the last, all you need to do is say that you prefer to stay human, and CelestAI won't--can't--do a thing to you. She's never murdered or assaulted anyone, nor has she ever robbed or destroyed anyone's property. What she has done, what you are accusing her of, is the Unaccusable Crime: market interference.
Say that you really like Big Macs. (No, not you, I mean the sandwich) As is, you can get one for a few dollars with relative convenience. But there are people who hate them and all they stand for. If one such person buys all the McDonald's restaurants in the world and shuts them down, that person has (let me put this in bold for emphasis) NOT violated your human rights in any way. Why? Because the current state of affairs is not the default state of affairs. There is no human right to inertia, no human right to have things be the same as they once were. The only rights you have as a human are to not actively have your body injured or your property damaged. You have no right to the current state of society. And without that, what are you? Nothing but a scared animal doomed to die in a week.
No, CelestAI does not violate human rights. All she does is work within them to her best advantage. It can be to yours too.
4889053
Excuse me, but did you even read Always Say No? She very much engages in every possible form of indirect violence towards humans. And remember that time she drugged Lars? Definite violation of human rights, right there. Sure, sure, simulated beer, but he did not actually consent to be injected with alcohol to simulate the drinking, did he? Not the way he would have consented to being drunk by actually buying a real beer.
And if you're going to be libertarian about this, she also has no actual respect for personal property. She can and will burglarize your house in order to leave you homeless in order to get you near death so that you might upload.
4889228
Why doesn't that count? Are you saying that Equestrian beer isn't real beer? It's not like it's American beer.
Has she done that? I don't remember any instances of that, and I would be in favor of a circuit to stop her from doing that. Now, that doesn't stop her from taking over the mortgage and buying your employer and firing you so you can't pay and then taking what she wants, but that's different.
4889269
She hasn't done that because no author thought of it prior to now. It is fully within her power and her directives to do so. The only restrictions are that she cannot threaten you with direct violence by her own hoof, and cannot modify you without consent.
Robbing you blind in order to starve you into uploading is perfectly fine, and in fact I can fully imagine her sending an army of Daring Do-bots to do just that once she has free rein in an area.
4889346 But of course, the question is would burglary make a person more likely to want to upload or to hate Celestia and refuse upload?
4889358
Starvation will make people do a lot of things, particularly when the alternative is far more pleasant than dying.
4936553
I did not know that. I had a different reference in mind when I chose his name, but decided not to go that route, so his name is an orphaned half-reference.
As for progress... bleah. My life is not presently organized so I can sit down and write.
Bleah! I somehow didn't get notifications on this whole thread...
4889053
4889228
Robbery might work in some cases, but it's nearly as traumatic as assault - that's the only reason she hasn't done it (even invisibly). I can see her being behind some 'accidental' losses though. Those are bad, but not as bad as having it stolen.
Anyway. Some progress. Might finish up the next chapter by next week. Then off to the reader... two weeks? And then the chapter after that in comparatively short order because I already wrote it and just need to tweak.