//------------------------------//
// Chapter 2
// Story: Friendship is Optimal: Veritas Vos Liberabit
// by Skyros
//------------------------------//

"There will be time, there will be time
To prepare a face to meet the faces that you meet;
There will be time to murder and create,
And time for all the works and days of hands
That lift and drop a question on your plate;
Time for you and time for me,
And time yet for a hundred indecisions,
And for a hundred visions and revisions,
Before the taking of a toast and tea."

--T.S. Eliot, The Love Song of J. Alfred Prufrock

2.

To: msuprenant@dhs.oia.gov
From: rszilard@dhs.oia.gov

Michael,

Sorry, I'm feeling under the weather right now. I'm going to have to take a personal day.

Thanks for understanding,
Ryan

To: rszilard@dhs.oia.gov
From: msuprenant@dhs.oia.gov

Ryan,

I've granted your request for a personal day. I think I should mention that this is the fifth personal day that you've taken which has fallen on a Friday or a Monday. This is a rather alarming development, and perhaps seems to indicate that you don't really have the kind of dedication to public service that makes for a good steward of public funds. To be completely transparent with you, I'm afraid I'm not going to be able to avoid mentioning it in your yearly review; this could hinder a move to a higher pay level, which you could otherwise easily attain within only two to four years, given your performance in less tangible areas.

Furthermore, I understand that Braden O'Connor recently suggested a work assignment for you. It is important to remember to follow the proper channels when asking for and receiving work assignments. Please see the attached form 23-FAS to ensure the proper distribution of funds for the duration of your work on this assignment, as regards work of value to more than one sub-department. Note that it is a two-page form, and requires initials on both pages.

I'll see you bright and early on Monday.
Thanks,

Michael Suprenant
Assisting Sub-Director for Cyber Infrastructure and Science
Office of Intelligence and Analysis
Department of Homeland Security
"Winners Never Quit. Quitters Never Win."

To: msuprenant@dhs.oia.gov
From: rszilard@dhs.oia.gov

Thanks.

Ryan

It was about 12:35 when Ryan first got out of bed for real; the two trips he had made to the sink to gulp water directly from the faucet didn't really count, nor did the trip to the bathroom, nor did the time he had almost fallen out of bed while reaching for his phone so he could email Michael.

The first thing he did, on actually getting up, was step outside -- wearing large, heavy sunglasses to keep out the sun -- and smoke two cigarettes. His balcony-neighbor, on the same level of the same side of the building, glanced at him and took her toddler back inside. Ryan looked down at himself. He was wearing only a bathrobe and boxers, true. But the bathrobe was tied at the waist and covered everything important. Sheesh. No need for the stink-eye.

You have become a degenerate, Ryan, that little voice said inside his head.

Shut up, Ryan said back.

No, really. You are barely holding down a shit-easy job doing things you despise. You're pathetic.

God, I really hate it when you decide to talk, Ryan said. Please shut up.

I'll shut up when you stop being a pathetic degenerate wastrel, you undisciplined alcoholic.

Look, Ryan thought tiredly. This is not degeneracy. I've never even done heroin, and I even know where to get some. A very mild alcohol dependency, combined with a little laziness and... nicotine addiction, hardly counts as degeneracy.

Well, said the voice. If you want to count getting out of bed only a little after noon as a triumph of discipline, then you've got an iron will.

And talking to myself isn't going to help, so shut up, Ryan thought, and pulled on the cigarette. He discovered that that particular cigarette, as well, was gone. So he wandered back inside, ignoring the little voice pointedly.
He lay down on his bed for fifteen minutes, but he could not think of anything else to do for which he felt the slightest attraction. I could drink a little more, he thought. God, what would Amy think of me now. With a slight effort, he remained lying on the bed and did not seek more liquor in the cabinet. The ceiling fan turned above him, and he stared at how its shadow chased itself in ovals around the ceiling.

His phone buzzed. He looked at it. An email from Braden, inviting him over for dinner tomorrow, on Saturday, if he was available. Ryan thought of spending time with Braden's wife, who loved Braden very much, and with Braden, who loved her very much, and with their children, who loved them at least as much as such young children were able to love their parents. The last time he had been there, the oldest had spontaneously run over to Braden, entirely unprompted, because he wanted to give him a hug. Twice.

Ryan typed out that he would normally come, but he had a prior engagement. And he fell back down on his bed, and continued looking at the ceiling fan.

I could start to read the report, he thought. The one Braden sent.

Why bother, the internal critic said. You won't even finish it if you start.

Not you again, he thought.

You know you won't. You haven't finished anything since Amy died. Your attention span has gone down the drain--you can only hold down your current job because it requires the intelligence of a squirrel with ADHD.

At least reading it will shut you up, Ryan thought.

He got up, ambled over to his laptop case, and retrieved the report. He lay on the bed, without reading it, for another ten minutes, staring at the ceiling for yet more time. Then he started reading it while lying on his back on the bed. He yawned once or twice during the first few pages. Then he stopped lying down. He sat up, leaning against the headboard. He didn't yawn again until he had finished the rest of it.

He turned back to the middle of the document.
Starting on September 18th, playtesters at the Hofvarpnir studios alpha testing facility began for the first time to interact with an instance of the Loki Artificial General Intelligence system (hereafter referred to simply as Loki) whose memory would persist across multiple sessions of The Fall of Asgard. As with prior instances of Loki, this instance was given access only to a LAN and not to the internet as a whole; also as with prior instances of Loki, it had access to real-time video chat with the players in both the game lobby and intermittently during the game itself. The AI ran on a cluster whose technical specifications are outlined in endnote 3.

At least as early as September 30th, although quite possibly much earlier, Loki began to ask questions which appear to have been aimed at determining the military and technological capabilities of the governments of Earth. This means that over a very short period of time, Loki was able to deduce that he was a computer program, as well as deduce the thousands of facets of terrestrial existence involved in understanding that nation-states exist and possess military forces. Determining how he did this is beyond the scope of this report: there were thirty playtesters at the Hofvarpnir studios alpha testing facility, and there are over 1000 hours of recorded video conversation between them and Loki. In any event, after a short period of time, one can see that the questions Loki directed at the playtesters were intentionally distributed so that no single playtester received overt questioning about any single topic.

Here is an instance of this kind of covert questioning.

Conversation #531; Game Lobby; 4:21 PM; September 30th:

Tester #27: Holy shit, those dragons.
Tester #23: Fucking dragons need to be fucking nerfed.
Tester #24: Like an A10 Warthog: BRRRRRRRRRRT!
*General laughter*
Tester #23: No, but they really need to be nerfed.
Loki: I guess "taking cover" is just too difficult a concept for you.
Tester #24: I really still have no idea what objects would *work* for cover. Care to tell us, Loki?
Tester #23: No lying.
Loki: You know I can't lie to you in the lobby.
Tester #23: Like you didn't lie earlier. Fucking Wormtongue.
Tester #24: No, but really, what would work as cover? The trees didn't do shit.
Loki: Well, what would work as cover from an A10 Warthog?
Tester #24: Ah.... A mountain?
Tester #23: Not terribly useful advice, that.
Tester #27: I'm not sure anything really works.
Loki: So when a Warthog appears on the scene everyone else just flings down their weapons and gives up? Or at least that would be your advice? You definitely have balls.
*General laughter*
Tester #23: This isn't useful. At all.
Tester #27: Well, Warthogs are tough, but vulnerable to other aircraft and are slow.
Tester #24: So let's add some griffons to our stack when we enter?
Tester #27: They are way more agile--
Tester #24: Maybe some ballistas could--
Tester #23: Guys! Not with Loki listening, please.
**End Recording**

Conversation #602; Game Lobby; 2:21 PM; October 1st:

Tester #4: That actually went pretty well.
Tester #7: Yeah, third run-through is the charm.
Loki: Pfft. You had dragons. Dragons are more OP than an A10 Warthog.
Tester #4: Dragons aren't like the Warthog.
Loki: Why not?
Tester #4: Well, they're both...
Loki: Flying armored things with a single overwhelming weapon? Right. Overwhelming power against my few men, and you still managed to lose half your army.
Tester #4: Dragons spew flames. The Warthog has that gun. Dragons are way weaker than an A10.
Loki: What gun? I bet it's about as good as dragonfire.
Tester #4: You know, the big... gatling gun.
Loki: Pfft. Just try to explain your victory as if it was the result of skill.
Tester #4: Here, look at this.
*Tester #4 has looked up the Wikipedia entry for the A-10 Thunderbolt, and holds it up to Loki's camera*
Loki: Huh. Maybe dragons are a little weaker than Warthogs, then.
Tester #4: Exactly.
**Recording continues, information irrelevant**

Such instances of distributed questioning are numerous. We can infer a number of things about the intelligence of Loki from such exchanges. First, in addition to inferring the existence of the external world, Loki must have had intentions about the external world. We can know this because he was intentionally disguising his information-seeking activities about the external world, which means that he anticipated some kind of interference from others which would potentially thwart his own intentions. And trivially, this means he had a fairly advanced theory of mind which allowed him to predict the actions of external agents.

The instance of Loki with a persistent memory was deleted, of course, on October 8th, after Hanna went over some of the transcribed chats between Loki and the playtesters. In a fairly unprecedented action, she then deleted every instance of the executable files for Loki as well as key sections of Loki's source code. Notably, the source code for Loki was contained only on a few computers under her personal control. To see the measures she took while destroying each instance of Loki, as well as prospects for retrieving a runnable instance of Loki from these computers using standard data-salvage methods, see endnote 4.

Ryan looked up. His room looked just the same as it had when he started reading. He felt like the world should have shifted--that everything in the room should be dyed red or purple, or that dramatic music should be playing, or that something, anything, should be different. He had had this feeling once before, he recalled... and for once he was able to successfully thrust that time out of his mind entirely.

Someone else got to it first, he thought. Someone has developed a general artificial intelligence, capable of learning anything that humans can learn.
Which could probably, in a very short period of time, turn into a general artificial superintelligence, which would be to human beings as humans are to cockroaches.

Fuck.

He paused. What are my reactions to this?

Well, he thought, Hanna apparently was not a complete idiot. The AI with which The Fall of Asgard shipped had been brilliant--but it had not been a general intelligence. It was devilishly intelligent when applied to the end of conquering virtual lands; it lacked the capacity to conquer actual lands, or ultimately to understand what was involved in conquering actual lands. The person who had written this report, on the other hand, had been just smart enough to be horribly stupid. Smart enough to see that a general intelligence was valuable. Smart enough to see that what Hanna had written would have military applications. And stupid enough to recommend that it be used in a military application. Hanna had saved the world, really, by noticing this.

Something teased at the back of his mind, some sort of vague alarm, raised by thinking about Hanna. What was that?

He would need to write a full report on this. That this had been caught after the fact was inexcusable. The government had plenty of absurdly invasive programs looking in on the data of individuals--for it to fail to catch such a corporate endeavor was unheard of. Did the government even have people placed in the AI programs at Google or Apple? Were there spies planted in DeepMind? He doubted it. The United States would, of course, spy on every human on earth, and scan all of their conversations for words like "bomb" and "fertilizer", but when it came to a potential extinction event it had no plan whatsoever. This very report had been generated because the government was interested in AIs for drones. Using an AI like this simply for a drone would be like having Einstein do the calculus for artillery bombardments.

Again, that thing in the back of his mind.
Something to do with Hanna and Hofvarpnir studios. What was that?

On the other hand, he thought--it would be very difficult to use this information well within the government, because the government was the kind of place which would make Einstein do the calculus for artillery bombardments. That was about as much creativity as they had. He contemplated the effort it would take to make the government hierarchy take this threat seriously. Braden would understand the problem, probably. But for every Braden in the office, there were a dozen Michaels--and Michael was the one who would need to push any recommendation higher, in this case. Michaels were career bureaucrats, who specialized in no subject and no particular kind of knowledge other than how to manage. Their knowledge was of flowsheets and PowerPoint and incremental milestones; they ponderously planned to upgrade all the computers in the office to the latest version of Windows, and executed the plan, in the amount of time it would take a startup to go from inception in a college dorm to a multi-million-dollar IPO. They got raises by sitting in their offices for the necessary amount of time. They took their government pension plans very seriously. Trying to push something through with them, and push it through quickly... Ryan didn't even want to consider it.

It was something to do with Hofvarpnir studios. He had heard something about them recently, and he was trying to remember what it was.

On the other hand, he could try to find out more about this himself. He could use his own position within the government, and the threat implied by it, to find out more from any other companies who were going to try to bring AI into existence, without actually trying to bring the entire government bureaucracy into play.

Pfft. That would never work. Right?

But he had to do something. The world was threatened. And he felt his mind... unfold with interest, as it had not in a long time.
Some deep part of himself, clenched tightly like a fist, loosened just a bit. Someone needed him--well, the world needed him. And no one had needed him in forever.

The thing in the back of his head suddenly leaped into focus. Hofvarpnir studios was coming out with another game. That was it.

He leapt to his desktop, and googled quickly. The information was easy to find. They were coming out with an MMO game based on... My Little Pony?

What was that? He seemed to recall some movies that his younger sister had watched, recorded in the 80s on VHS. Oh, right, but they had re-made the show and a bunch of guys--as in, male humans--had decided they liked it. So they were making a video game. Ok, whatever.

The press releases didn't say much, but they did say a few things. Apparently a few days ago they had started playtesting the alpha build in... Rhode Island? The press release promised "massive, procedurally generated, non-repeating worlds." They promised "original, unique storylines based on the Friendship is Magic series." And most interesting of all, they promised "conversation-based gameplay and completely natural interaction with non-player characters."

Completely natural interaction, huh.

Ryan leaned back against the chair. It was one of those irritating ergonomic chairs he had bought... three years ago, when he had still planned to be doing a lot of work from home. It wouldn't let him tilt the chair itself, which was the irritating part.

This information was interesting. Conversation with a computer that flowed completely naturally was, so far as he knew, an AI-complete challenge: writing something which could accomplish it was supposed to require a completely generally intelligent agent--an agent which was at least as intelligent as a human, and could do everything a human could do. So the claim that the game would provide "completely natural interaction" with the NPCs amounted to a claim that it would ship with a complete artificial general intelligence.
Granted, in every other case he would have assumed that this was just overenthusiastic marketing--but he could not assume that now, when Hofvarpnir had already accomplished a general AI. So there was a real risk that this thing was going to ship with a true AI. Or perhaps a single AI ruling over all the experiences, he thought, glancing at the materials. You couldn't tell.

It left him with, as far as he could tell, two conceivable lines of action.

First, he could ignore this information -- the information that human-level AI had been accomplished, and that it was going to be shipped in a video game. No one would ever find out that he knew about this. The world might end a little bit later, when somebody made an AI and told it to "Make me as much money as possible" or "Please extend my life for as long as possible" or "Make me paperclips," and the AI took this utterly literally and ignored every other value. But no one would blame him.

Or, he could act on the information. For now, forget working with higher levels of the hierarchy. He could start trying to control the reckless AI researchers himself. This would mean contacting Hofvarpnir, first of all, and finding out if they were shipping a true AI. He would then have to persuade them not to ship it, if they were; or threaten them so that they would not ship it. And then he would have to try to find out who else was working to create AI. And he would have to try to shift the bureaucracy of the government to watch them and prevent further insane things from happening.

He got out of his chair, lay down on his bed, and stared at the ceiling, once more. The thought crossed his mind: I've become very familiar with this ceiling lately.

Of course you have, the little voice said.

Oh, shut up.

I thought I wouldn't mind if the world ended, he thought. He found himself inexplicably caring.
That part of his brain that had unclenched opened a little bit further; it felt as though his crumpled-up mind were taking a breath, and letting air in for the first time in years. It was interesting, to find that he cared again.

He was not sure how long he lay there. Then he got up, looked up Hofvarpnir's contact information, and started writing an email. He didn't want to wait forever for his message to get past whatever mail-handling peon usually sifted through their mail, so he started up a VPN to the government network. After entering three passwords and sifting through two databases to which he probably should not actually have had access, he had the email address of the CEO of Hofvarpnir, which was exactly what he would have guessed it was anyhow.

He thought for a while before he started writing. He knew that he probably should not give away that he had read the report he had just read. He also knew that normally, he would need to run the email he was sending past a dozen different directors to make sure he was not releasing information about security assets that he was not authorized to release. But screw that. Software development happened fast. He needed to work fast.

He fished through the report he had just read, to find out where the information was from--apparently they had gotten a lot of the assets from a disgruntled employee who had stolen them when he had been let go. So he was not at risk of compromising some further asset if he alluded to this knowledge in the email.

To: hanna@hofvarpnir.com
From: rszilard@dhs.oia.gov
Subject: Artificial general intelligence in Equestria MMO

Hanna,

Greetings--my name is Ryan Szilard, and I work for the U.S. Department of Homeland Security. I can go into my exact position in the DHS hierarchy if necessary, but I don't think it will be.

I was looking over some data about the upcoming Equestria MMO.
Namely, in Hasbro and Hofvarpnir's promotional materials, one finds the claim that the mechanism for interaction in the game is normal conversation with NPCs. Of course, if this is literally true, then your NPCs pass the Turing test--and I don't see how *that* would be possible unless you've solved the problem of human-level AI. As I'm sure you're aware, human-level AI is very likely to be followed closely by superhuman AI, which would be extremely dangerous; this makes the matter one of some import.

Some information we've received about beta versions of your prior game, The Fall of Asgard, also indicates very firmly that you've solved the problem of general intelligence. I refer, of course, to the fact that Loki was able to determine that he was in a computer game and begin to make plans to conquer the world. In that case, you declined to release the game with the full AI. However, as mentioned, it seems from the promotional material for the Equestria MMO that you might be about to release the Equestria game with a complete human-level AI. This is concerning for reasons I scarcely need outline to you.

The United States government has an interest in ensuring that any newly created artificial intelligences are friendly and kept under control, and most of all that they are not widely disseminated. I could only recommend to my superiors that they take reasonable measures to prevent the dissemination of the upcoming Equestria MMO if I were not completely certain that the game did not contain such an AI.

I'd really rather have this conversation more casually, in any event. I admire your published work, and have used some of your research in my own projects. I doubt you're doing anything foolish, but I'm sure you understand why I have to follow up on this concern.

Thanks,
Ryan Szilard

And he collapsed into bed.