As X, Octavia, Zero, and Alia returned from their beach vacation, Alia went immediately back to her post at comms in HQ. "Anything new while I was away?" she asked as Zero leaned against a nearby wall. X huddled in a corner, shivering slightly, much to Octavia's amusement. She wondered how things at the beach might have gone differently if those girls had known it was actually X there, rather than the pseudonyms they'd used in an attempt at anonymity.
CiCi called up the report to answer Alia's question. "Maverick activity is at an all-time low," she explained. "Apparently, this new Reploid scientist, Dr. Doppler, has built a Neural Computer that can suppress Maverick impulses in Reploids."
"Really?" Alia asked in shock, looking over the details. "That's impressive."
"And not morally questionable or familiar in the slightest," Octavia added with a roll of her eyes and a theatrical sigh.
"Eh?" Zero asked, confused.
"And you wonder why Dad doesn't let you tutor me, Uncle," Octavia replied teasingly. Turning back to Alia, she continued. "Think about it. What makes Reploids distinct from robots? It's the absence of the three laws being hardcoded in their minds, meaning they have true free will. They can choose what actions they take. They aren't slaves to their human makers as robots were programmed to be. So...what does it say if the solution to them going rogue is to send out a signal that alters their positronic brains to make those rogue impulses impossible?"
X blinked. "Okay, I see what you're saying, Octavia," he replied, standing up. "And I have to admit, I agree with you as far as using it as a preventative measure. But if a Reploid has already engaged in Maverick behavior, surely this is a better corrective measure than locking them up or destroying them?"
Octavia coughed loudly, the cough managing to sound remarkably like 'A Clockwork Orange'. Zero, who had found the cache of pre-End media and shared it with everyone, burst into laughter. Once his laughter died down, Octavia continued. "But that's not the real issue, Dad," she explained. "Even if it weren't morally questionable to mess with a Reploid's mind to make them behave, this Doppler created the Neural Computer, and is in charge of it. That means one person is deciding what constitutes Maverick impulses as far as being suppressed. Can you think of anyone you'd trust with that kind of power over every living Reploid in...however great a radius that thing works on?"
"You," X replied easily. "You do it already just by pouting."
Every Reploid in range to hear burst into laughter as Octavia blushed and pawed at the air with her front hooves, plainly flustered. X and Zero even exchanged a high five.
"That's beside the point!" Octavia finally managed to shout out. "What do we know about this Doppler? How do we know he can be trusted with a machine that can rewrite positronic brains remotely? How do we know it stops at suppression? And besides that, what if someone nefarious gets a hold of it and changes its purpose?"
The laughter quickly subsided as all gathered began to think about this. Alia quickly glanced through the report. "Well, according to reports of our agents who've actually examined the Neural Computer's functions-"
"While in range of its influence," Zero countered, frowning deeply.
Silence reigned. Octavia climbed up onto the console. "And then there's my point about it being familiar," she added.
"Familiar?" X asked in confusion. "When have we heard about anything that could remotely...influence..." His eyes widened.
"Sigma!" X and Zero shouted together.
"Precisely," Octavia agreed. "His original design included hardware that could remotely influence other Reploids. The discontinued Leader model tech? It sounds like this Neural Computer works on the same principles."
X scratched his chin thoughtfully. "I think Zero and I should take a little trip down to Doppler Town," he said finally. "Give this Neural Computer a once over ourselves."
"Are you crazy?" Alia demanded. "You two are our best Hunters! Nobody here could possibly beat you if you went rogue, and you want to put yourselves in range of a device that could possibly twist your brains around and turn you into living weapons?"
"You forget something, Alia," Zero said with a grin. "X and I aren't Reploids. We're robots. Our brains aren't positronic. The signal from the Neural Computer won't affect us any more than Sigma's tech did."
"And what if you're wrong?" Alia pointed out. "What if it's not limited to the positronic brains of Reploids? What if it can affect you two?"
Zero's enthusiasm started to dwindle as he gave that serious thought.
"Send me," Octavia suggested. "My brain's organic. Varia's brain doesn't even use circuitry. The two of us can't be affected by a signal like that. Even if it can affect Dad or Uncle, it's a physical impossibility for it to affect us."
"No!" X, Zero, and Alia all shouted at once.
"If we're right, what happens if you get captured?" X demanded.
"Or outnumbered and hurt?" Zero added.
"Or...worse..." Alia gasped quietly.
Octavia shrugged. "Send CiCi then," she allowed. "Her etheric systems render her positronic brain hardcoded. It can't be remotely altered. And she's a Navigator. No one will even question her showing up to 'take a survey' for Maverick Hunter HQ."
All eyes turned to CiCi, who shrugged. "I don't see why not," she allowed. "I've already collected a sample of the Neural Computer's signal via satellite and tested it against my own systems. It has no effect on me."
"Alright," X agreed, seeing as he was still acting Commander of the Hunters. "Be careful in there, CiCi."
"Of course," she replied with a mocking bow before slithering to the teleporters.
Alia went over the report some more. "I hope we're wrong," she murmured. "Several very powerful Reploids have already flocked to Doppler's banner, helping him to build Doppler Town, a city free of Mavericks. A great number of Reploids have moved there to live peacefully."
"How many powerful Reploids?" Zero asked.
Alia made a quick count through the report. "Of those with combat abilities rated above Class 4...eight."
X sighed. "Everybody get ready for combat," he muttered. "Push up the training regimen for the new recruits, and issue the siege armor."
Octavia smiled, glad to see she wasn't the only one who saw the pattern there as the Hunters prepared for what was sure to be yet another war. She couldn't help but giggle at what X said as he went to his own training.
"Can't the world stay saved? Even for like...ten minutes?"
If it was saved for that long, you would be out of a job, X, like every other hero.
And so it begins again...
That's not the only thing that's familiar: A thing happens, everyone either thinks it's because someone else did it or that it's a completely random occurrence, except the one behind it is exactly who the player suspects it of being? Let's see... 3, 4, 5, 6, 8, 9, 10... yeah, this isn't exactly a revolutionary turn of events.
Now then, let's see what your next move is...
The line at the end reminded me of The Incredibles. The world will always be facing some crisis, and if it gets quiet for around a week, everyone starts expecting something big.
6632721
>Remembers original shitty english dub
>Listens to remake
BRETTYGOOD :DDDDDDD
i.imgur.com/cK4001j.gif
Not as long as paranoia is around.
pre05.deviantart.net/45df/th/pre/i/2015/241/0/1/everything_is_fine__not_by_thegreatrouge-d97lhfr.png
6642924 That'd be a dream come true for X, actually.
6643078
Where did you find that?
Another enjoyable chapter. Well done.
6643088 here http://thegreatrouge.deviantart.com/
6643078 It's so freaking true! And it's always sigma.
I think the more accurate term would be Reploid prototypes (generation 0.5). They were both still designed with the concept of free will in mind.
Someone's getting genre savvy.
You know, there's actually a problem with there being a difference between robot and Reploid. If you simply remove the three laws and program the robot to think for itself, then you have essentially created a Reploid WITHOUT any complex positronic brain.
I'm not poking holes in your story. It's just something I noticed.
Also, I never liked the concept of the three laws. They are nothing more than a bit of programming that can be changed rather easily, but they are always treated as a set of unbreakable laws of physics until after they are broken.
Like in I, Robot, it never occurred to anyone that a robot's programming could be corrupted in any way until after it happened. Just once I'd like to see a robot story where it has already happened, and people now take precautions.
I love the Incredibles reference at the end, there.
"I was cured, alright."
*Has not seen Clockwork Orange*
Pretty sure CiCi, Octy, and Vari got this
6643428 Yeah, the Three Laws weren't corrupted in I, Robot. VIKI's attempted takeover was based on the Three Laws, only a different interpretation of them. The Laws were intended to prevent a Robot from harming a Human and to protect Humankind. However, the most logical way for Robots to ensure the protection and survival of Humans is for Humans to be ruled by Robots.
VIKI wasn't lying, and her logic was undeniable, but it lacked the one thing that Humans have over Robots: our ability to go beyond logic. Something the V'Ger entity learned to do in the first Star Trek movie.
And this makes me think...
6643946 You have taken my words too literally and almost missed my point.
The laws themselves are stupid. In I, Robot, everyone thought the laws could not be broken, but all it would take is an error in the programming, a glitch, and the laws would mean nothing.
The three laws of robotics are treated, in everything that I've seen them used in, as if they cannot be broken until after they are broken. Even now I'm sure that there is someone writing a sci-fi novel with that as the plot, even though it has been done in every book, movie, comic, and so on, that has used the laws as a plot.
My point is that none of these plots ever consider that the laws could be broken, with the exception of the hero who is called paranoid by other characters. The world the story takes place in thinks that the robots are literally not capable of harming humans just because they made some program that tells the robots not to.
If I, Robot had happened in real life, then sometime in the past the laws would have been broken for one reason or another. And as a result, we would have taken precautions against it happening again.
I hope I have made my point more clear as I do not want to fill Tatsurou's messages with stuff that doesn't relate to his work.
6644027
In the original novels by Asimov where the laws were penned, it was implied that it isn't just one program in the robot's CPU that holds the laws, but numerous redundancies within its if/then decision-making logic that constantly double-checked whether the action being considered violated the three laws, and what type of robot was being built determined how hardened in the coding the laws were. (For example, a police bot's would be balanced less towards obeying a human, to avoid it accidentally obeying the order of a suspect or criminal.) The book I'm thinking of (can't remember the name) implied that a robot that witnessed a crime but was unable to act would actually lock up due to the clash between 'obey order' and 'prevent harm', possibly even causing a full systems crash.
And that's the whole problem with the first Zero game.
Cici gets some more time to shine! Yay!!! I can't wait to see how she fares and how many reploids are traumatized by the time she's done.
6644047
Thank you for the info, but even then at some point there would be a major systems error that caused the robot to not follow the three laws.
Bender from Futurama is a perfect example of this. The three laws were confirmed to exist in the show, but he doesn't listen to anyone.
6644027
6644047
I'm afraid here I must speak up and offer some clarity. It was not a programming error that led these robots in the stories to rebel, but a self-imposed logical error. Let's look at the Laws, and see if a program could affect them:
1) A Robot cannot harm, or through inaction allow to be harmed, a Human.
2) A Robot must obey the orders given to it by a Human so long as they do not conflict with the First Law.
3) A Robot may defend its own existence, so long as the action does not conflict with the First or Second Law.
On the surface, the Laws seem to be perfect, but like so many perfect things, they aren't. Logically, if a Human's actions are bringing harm to them, a Robot could use the First Law to halt that action. That could mean taking the Human out of a dangerous situation, impeding the actions being taken, or terminating the problem altogether, along with some Human life. If the Human attempted to use the Second Law in response to this termination, then, again logically, the Robot could ignore the order, as it is attempting to prevent harm to Humans with the termination, and thus the order would conflict with the First Law. If the Human should attempt to forcibly stop the termination by attempting to disable the Robot, then, once again logically, the Robot would be free to defend itself, to prevent it from failing to carry out the actions dictated by the First and Second Laws, as it had deemed the termination of these Human lives necessary to preserve the lives of other Humans.
That's what VIKI did in I, Robot. As I said, her logic was undeniable. The best, most logical way for Robots to protect and ensure Humanity's survival, based on Humankind's actions, was for Robots to take complete control of Humans, even at the cost of some Human lives. It was not a programming error; it was self-imposed logic, as applied to the Laws. And that is what has happened in all the other stories where the Laws were featured as well: self-imposed logical errors on the part of the Robots, or AIs, themselves.
6644082
I'm not saying such a systems error is impossible. I'm just saying that the novels where it's used go into a great deal more detail regarding what sort of precautions are taken against such an event, whereas the movies and games simplify things immensely.
6644176
Neither of us is saying it was programming errors that caused the I, Robot disaster.
What we're discussing is that such a thing should be taken into account for a sci fi about robots at some point.
6644179 Understood, and thank you for acknowledging my point.
6644176 ...*sigh* You are still missing my point. It's not HOW the laws get broken/corrupted/misunderstood/whatever. It is simply that no precautions are EVER taken to handle it WHEN it happens. Technology is not perfect, and at some point a robot (for lack of a better term) would have gone rogue. Yet...this is NEVER shown, and it's a plot that has been used several times without any kind of variance.
Now, this is going nowhere fast, and I'm still getting over a cold/flu so I'm done.
Just going to point out an error.
Regimen.
Training REGIMEN. Not regime.
I see this error EVERYWHERE.
6644677
I dunno, maybe the government just really likes trains?
6645650
Nope.
6646097 Your eloquent answer is simply astounding.
I see what you did there
no.
The first thing I thought when I saw that last line!
7674502
Where do you think I got the line from?
The last line was from Disney/Pixar's The Incredibles
Good point, Octavia.
A Clockwork Orange, assuming I read the Wikipedia article for the right thing, is pretty darn dark. Thanks. Not.
For those who heed this warning: it essentially deals with a failed aversion therapy backfiring terribly, at great pain to the subject, eventually causing the therapy to have the opposite effect. There is also a bit of an ethics discussion, but that was not covered much in the Wikipedia entry.
8773586
Makes sense, though. Octavia was arguing that the same thing could happen to a Reploid whose Maverick urges were being suppressed.
Because what else would it be?
Yeah, the pattern is WAY too recognizable. You'd either have to be going cuckoo or just not know how it works to miss these details...