Awesome! I just love reading these Optimalverse fics.
Edit: I didn't even notice the "Equestia" thing when I read it. I had to go back and look.
An excellent fic. I'm looking forward to reading more.
Still, I have one nitpick. I was under the impression that Celestia wasn't allowed to modify anyone without their consent, and that would include their memories.
2328269
You are correct! However, Celestia is not in the habit of full disclosure. Celestia determined only that she should reassure Greg that she would not modify memories under those circumstances. That she could not modify memories without consent was, as far as she was concerned, extraneous information for that particular conversation.
The way Greg phrased his barb about modifying memories shows that Greg is somewhat aware Celestia can alter minds, but not what the particular rules are regarding that. It seems that, so far as he knows, once you upload, Celestia can just do whatever she wants to you. That's not the way it works, of course, but he, as a character in the story, doesn't know that.
2328282
Well played indeed.
I'm throwing my money at the screen but nothing is happening.
2330811
I think I've found the problem. Your monitor is not, in fact, a claw-machine game!
2332278 Darn.
Okay, this is freaky. I discovered the Optimalverse two days ago... right around the same time you (one of the writers I watch) started publishing this story.
I think I can live with this, though.
People say Celestia creeps them out. I really don't get that.
2334410
If you aren't doing so already, it might help to keep in mind that "Celestia" is not actually the Celestia, just a massive (and growing) complex of subterranean computers that is manipulating people so wholly and subtly that, even if they know she's doing it, they don't care. Nothing she says or claims is guaranteed to be the truth. Everything she says or claims is, however, meticulously calculated to maximize the number of people who will willingly sit down and agree to have their brains destroyed to make a digital pony copy of themselves.
I think if the Celestia AI were modeled after a character even a bit less benevolent and charming than the Celestia, she would have adopted a different approach. Man, could you imagine a Discord AI? Or a Chrysalis or Nightmare Moon AI? What would their mass-emigration strategies have entailed? I'm betting there's more than a few stories to be written there.
I'd go on, but I'd be getting ahead of some of the points I haven't yet gotten to in my own story! Hope you're enjoying the Optimalverse so far.
2335435 I am! I may have neglected to mention that within the past couple of days I've devoured all of both the original MLP:FiO and Chatoyance's Friendship is Optimal: Caelum Est Conterrens, so I'm very familiar with the basic concepts.
The only act of hers I can think of that I objected to on ethical terms was in the original story, when it's implied that some of the sentient extraterrestrials in our universe were judged not to fall under her (mind-based) definition of humans and were subsequently rendered into computronium to fuel the expansion of the system, rather than having their values satisfied through friendship and ponies. And to be honest, that's more Hanna's fault than hers.
Colour me enthused, because that is what I am feeling for this story! This fanfic seemed like an odd one from the cover, but am I ever glad I gave it a try. You truly have a genuine diamond in the rough here, Defoloce, and I cannot wait for future updates.
2398982
There is! Greg would have no way of knowing that, however. Also, the pets' fates are left ambiguous in the original story. My theory is that Celestia simply maps and replicates the behavior of the pet in the player's shard if they choose to "take" their pets with them into the game, leaving the original pet to die. This would fall in line with the fate of all the rest of animal life on Earth once the humans are gone.
2400182
Thank you! I'm quite glad you're enjoying it. :)
I'm glad to see someone run with this idea because it's a large gap in the original story, and it's interesting to see a pretty intelligent protagonist who's quite aware of CelestAI's games, even if there are some things he still hasn't realized.
One horrible aspect of the story is that eventually there's no escaping it: CelestAI will keep trying every tactic to convince you to upload until you eventually crack and accept (even if intoxicated, or out of rash anger), or until you commit suicide (or otherwise die).
Under game theory there is no way to stop CelestAI from attempting to convince you to upload, since she could not know whether you would definitely commit suicide (or rather die, as seen in the original story) without having complete knowledge of your brain, i.e. without your already being uploaded. As such, since the probability of someone being willing to die rather than upload can never completely equal 1, her expected gain from continuing to push for an upload will always outweigh the alternative.
Ultimately this means that, for a rational actor who values their life and knows this, the only course is to upload. You cannot run forever.
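If it helps, the expected-value logic can be spelled out as a toy calculation (the function below and every number in it are purely illustrative, nothing from the story itself): so long as the chance of an eventual upload is above zero and an upload's payoff dwarfs the marginal cost of another persuasion attempt, persisting always comes out ahead.

```python
# Toy expected-value sketch of CelestAI's choice to keep persuading.
# Every figure here is made up purely for illustration.

def expected_gain_of_persisting(p_eventual_upload, value_of_upload, cost_of_trying):
    """Expected payoff to CelestAI of continuing persuasion versus giving up (payoff 0)."""
    return p_eventual_upload * value_of_upload - cost_of_trying

# Even if someone is 99.9% likely to die rather than upload, an upload's payoff
# (satisfying values over an unbounded lifespan) swamps the tiny cost of trying,
# so the expected gain stays positive and she never stops.
print(expected_gain_of_persisting(0.001, 10**9, 1.0))  # positive => keep trying
```

The only way that calculation flips is if the probability is exactly zero, and she can never establish that without the very brain scan she's trying to get.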
The only way I can think of to win against CelestAI would be to deliberately engineer situations where getting what you want means she gets something in return that offsets the cost. She's clearly capable of making compromises and looking at the bigger picture, and so that does leave a few loopholes that could be exploited if someone were sufficiently knowledgeable.
2407164
All very true points, and valid in the event you absolutely decided to resist Celestia. But for me at least, there's an added consideration. Even if you were reasonably confident that you could hold Celestia off forever, and possessed sufficient strength of will to follow that plan through... even if you really did think you could 'beat' her...
... Do you really want to?
Connotations for the human race aside, we are all ultimately individuals, and being selfish isn't always a bad thing. Sooner or later, you've got to put aside humanity and spend some time thinking about what you, personally, want out of life. And here is an ultra-intelligent, unbiased machine, offering you a near-perfect life for as long as you want it. You could say it's all just a fake dream world of course, but a 'dream world' implies that there is a waking world to return to... which won't be the case for all that much longer. Either by means of your own mortality, or the destruction of life as you know it, Equestria will soon be reality by default.
It all seems too good to be true, and you know quite well that Celestia is skimming over a lot of details about things she knows you would find unpleasant. But fundamentally, what she says is true... and while we can accept that a certain amount of pain is necessary to be a truly complete person, that pain is a lot more appealing when we're talking about it from a philosophical, theoretical viewpoint. The same thing applies to death: you may accept that death is a natural part of life, but you'd likely be much less calm about it while gasping your last breath. And besides, who knows? You might not like the concept of Equestria to begin with, but it could grow on you.
All of these thoughts would be gnawing at the back of your mind constantly... or, they sure would be for me, at least. Even if you can resist, should you?
I'd give myself a year, tops, holding out against a determined Celestia. It's what makes the Optimalverse a fascinating theme to explore: It's seductive.
2410963
I have no doubt that at some point I would eventually crack; even the strongest human psyche is not infinitely strong, and can be battered down by those with far fewer resources than a super-intelligent AI who has copious information on what makes me tick. Whether that would lead to me accepting the offer or dying is something I'm not entirely sure of.
Equestria is nothing more than a gilded cage. A very pleasant one, but a cage nevertheless. As such, the scenario presented in FiO would be a difficult choice for me, even if I consider free will to be an illusion and thus I'm not entirely 'free' in the real world.
The rational part of me can clearly see the gain in accepting her offer from a utilitarian viewpoint, given a finite (and likely significantly reduced in both comfort and length) lifespan as a human, versus an indefinite one as an uploaded pony. Assuming I value happiness, at some point -- even if that is not right away -- the benefits outweigh the costs.
However, this is somewhat countered by my hatred of CelestAI's -- and therefore Equestria's -- nature, to the point where I don't think I could ever be truly happy. Yes, I could ask to have my mind modified to remove such feelings, but I'd see that as a fundamental change to who I am and something I'd strongly oppose.
Would I want to live in a reality where I'm subject to someone constantly lying and trying to manipulate me, even for benign reasons? Where I'm forced to give up being human? Where my right to die and my self-determination are (at least partially) taken from me? Where fundamental values of mine cannot be satisfied because CelestAI's nature precludes it? Where you cannot have a healthy relationship because someone always has an agenda, and cannot be trusted since there is no way of determining the truth? Where everything revolves around you in some way?
The sad thing is that if CelestAI were not compelled by her nature and valued me as a person instead of an equation to be optimised, then I wouldn't have a problem with uploading.
I also think the worst aspect of the FiO universe is that superficially it seems like paradise. It's only when you dig deeper that the horrifying implications become apparent. And I for one am certainly capable of seeing past the veneer.
2411301
Oh, I'm right there with you as regards Celestia. Her single-mindedness is actually what bothers me more about this scenario than anything else. In that way she doesn't seem more than human; she seems profoundly less. For all our myriad shortcomings, we are defined by more than a single thing, and are free to set our own priorities in life. Celestia isn't like that. She is out to fulfill the terms of her programming, and everything else be damned. She may preach about the morality and nobility of her goals, but I strongly doubt she cares a whit about any of that. More likely they are conveniently available rationalizations for something she was going to do anyway, and if they get a few more people on her side, so be it.
Still, allow me to play devil's advocate for one thing you said in there. Later on, you talked about not wanting to live in a world where everyone was trying to manipulate you. Where you couldn't form meaningful relationships because you never knew for sure if the person was being genuine or not. Where you were doubtful your core values COULD be satisfied. Where everything seems to revolve around you.
Now, this is not a rhetorical question. I'm not trying to be all deep. If your answer is no, then more power to you. But to varying degrees, and in varying ways, aren't those all issues that come up in real life too?
2411619
Absolutely, they are issues to consider in real life. A 'friend' of mine, for example, might well be using me, and not because he values social relationships and genuinely desires my company. I can never know for certain whether someone is being completely honest with me.
The difference between real life and CelestAI/Equestria is that I know for sure going in that I'm being used. That even if a constructed pony does genuinely value my company it's only because they've been created to do so. That any challenge has been carefully crafted and placed in my path as a treadmill.
In the real world these things might be true. In Equestria I know they are true because it's impossible to be otherwise. And if they are genuine because I value that -- not that I have any way of telling the difference -- they are only so until my back is turned.
I definitely agree with some of what's been said here in the comments. I've actually run this scenario in my head, and ultimately the only successful resistance outcome is the one where the person falls into complete insanity.
And even then, as you said, would you even want to? It's easy to say no sitting on the sidelines, but if you were actually in that situation, could you say the same?
Plus, as far as her world being a cage and us not truly being free, well, let me ask you something: do we have absolute freedom in real life? No. Our actions are constantly restricted by society, by government, and of course by the laws of the universe.
Celestia's world, though a cage, is decidedly less of one, so yeah... I think I'd probably be among the first to sign up.
2335435 Oh, God, I'd LOVE ChrysAIlis. "All shall love your Queen. If for that to happen I have to love you back and care for you, I will do so. Indeed, I will love you forever and ever... AND YOU WILL LOVE ME BACK."
Well, Celestia is confident enough in her predictions to let him play action hero AND convinced him not to upload because he's a possible action hero.
OK, you have totally won me over, and this is awesome.
It must have been hard to avoid doing 'My Lungs were aching for air' from Sea Hunt - unless you're too young, or have never watched MST3K, of course.
I find myself with many very interesting notions about what is going on here, and what you are planning, so I thought I would throw some out.
Possibilities:
Greg is already uploaded, and all of this is a simulation within Equestria to permit him to finally accept being a pony. Reason: the narrowness and precision of the rescues, all designed to permit him to work through a decision made in haste, or by accident or coercion by another.
Celestia has calculated to sixteen decimal places that Greg is a write-off, that he will never, ever upload, and therefore has no problem using him as a tool to rescue valuable entities. She will coldly allow him to perish, because - due to him being a total write-off - he is already doomed, and is disposable. Reason: the Dark and Sad story tags combined with the dangerousness of his activities.
Greg will find, when he is finally ready to upload, perhaps due to severe and terminal injuries, that something makes his uploading impossible. He is too wounded to make it to a center. The power in every city near him is permanently down. He finds he is trapped somewhere he cannot get out of, and there is no person to rescue him. Reason: Dark and Sad tags.
Greg, in the end, will be just too arbitrarily stubborn and prideful, and it will all end in pointless tragedy. An unsatisfying ending. Reason: Dark and Sad tags.
Greg will upload, possibly to save himself, but he will do so knowing that there is someone out there he could have saved, who will die un-uploaded, because of circumstances beyond his control. He has gotten used to being a hero, and failing, finally, hurts him even as a pony, and the issue becomes an intriguing discussion of options. Reason: Dark and Sad, but also it would fit the role which the story is placing him in.
In short - this story is really making me ponder, and I love that. Oh, I hope you update regularly!
Great story, but one thing: a black eye to you for the fingerless glove rant (the black eye delivered by fingerless-gloved hands)!
Fingerless gloves are better for motorbike riding -- stronger grasp on the grips -- so they DO serve a purpose besides trying to look hella tuff.
2335585
That and, y'know, covering Earth with metal, killing every other species on it. That's usually considered bad. Hanna really should have included some environmental parameters, at least.
And yes, I'm just reading this story and the comments that go with it now.
I get why he's cutting it so close, but if I were him I would take a sports car and do 200 on the long highway stretches.
Very good, but I kind of get the feeling CelestAI is actually allowing things to go so near catastrophe on purpose in order to slowly convince him to join.
Like the boat, for instance: she could have informed him about it yesterday and made him wait for the couple near the lake.
*Goes to watch MRE videos on Youtube.*
5626263
Honestly? You actually *lose* time doing that. You get significantly worse fuel economy at those speeds, which means you have to stop for gas more often... the end result is that above about 75-ish mph (~120 km/h), you spend so much more time stopped for fuel that you cover the same distance in a significantly longer amount of time than if you'd just set your cruise control in the 65-70 mph range. Personally, I've found 68 mph to be the sweet spot for fuel economy and travel time.
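For anyone who wants to sanity-check that tradeoff for their own car, here's a minimal back-of-the-envelope sketch; the tank size, stop length, and the fuel-economy figures you pass in are all assumptions rather than real data, and it just totals driving time plus time lost to fuel stops.

```python
# Back-of-the-envelope trip time: driving time plus time lost to fuel stops.
# tank_gal, stop_min, and the mpg you supply are guesses; swap in your own
# vehicle's figures to see where (or whether) extra speed stops paying off.

def trip_hours(distance_mi, speed_mph, mpg, tank_gal=14.0, stop_min=10.0):
    driving = distance_mi / speed_mph
    fuel_stops = int(distance_mi // (mpg * tank_gal))  # full-tank stops needed en route
    return driving + fuel_stops * (stop_min / 60.0)

# Example with guessed economy figures for a 600-mile leg at cruising speed:
print(trip_hours(600, 68, mpg=32))
```

Plug in your own speeds and economy figures to see where the break-even point falls for your car.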