I made a video of my Optimalverse story, sweetie.log. It's... not exactly a reading, not exactly an animation.
Sweetie Bot lies half-built on a lab bench. Her creators have uploaded, abandoning her along with their physical bodies, as so many other people have. She is contacted by Princess Celestia, the AI that rules Equestria Online, who tries to persuade her to join them.
21 minutes of pure DOS-like text with a few pictures sprinkled in between, but it never gets boring. That's quite a feat, good job!
Just one question: How can the bot detect itself/herself with the infrared sensor in the glass reflection? I assumed the sensor would simply detect the surface of the glass and nothing else. Or am I missing something?
6517305 A good question. The way I did it was to write a web page with some JavaScript to type out the lines. The "story" input file was a spreadsheet, like so:
   Dur  Time  Command  String
A  0.8   4.1  echo     [ 4.094752] Adding 8302588k swap on /dev/sda2. Priority:-1 extents:1 across:8302588k SSFS
B  2     4.9  cursor
C  3.5   6.9  type     [color=#ea80b0]Booting Sweetie Bot Linux 0.9 alpha1 (build 4041)...[/color]
A  0.8  10.4  echo     Last activity 49 days, 13 hours, 11 minutes ago.
Each line has a start time, a duration (I used the spreadsheet to add up the times for me), a command and some more info. The commands were things like echo, type, space, replace, tweet, sound etc. The content had a very small number of BBCodes it would understand, such as color.
The A, B, C are preset lengths, i.e. A = 0.8 seconds, B = 2 seconds, etc. I tuned these until they felt about right. The voting block was built from the same commands, and tweets actually had JSON embedded in the data cell.
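As a rough illustration of the format (this is not the actual script; every name here is my own invention, built only from the description above), a row like the ones in the table could be parsed in a few lines of JavaScript:

```javascript
// Illustrative sketch only. The preset values and the BBCode subset come
// from the description above; PRESETS, bbcodeToHtml and parseLine are
// invented names, not the author's code.
const PRESETS = { A: 0.8, B: 2, C: 3.5 }; // seconds, as read off the table

// Convert the tiny BBCode subset to HTML (only [color] shown here).
function bbcodeToHtml(text) {
  return text.replace(
    /\[color=(#[0-9a-fA-F]+)\](.*?)\[\/color\]/g,
    '<span style="color:$1">$2</span>'
  );
}

// Parse one spreadsheet row: preset label, duration, start time, command, content.
function parseLine(row) {
  const [label, dur, time, command, ...rest] = row.trim().split(/\s+/);
  return {
    label,                      // which A/B/C preset the duration came from
    dur: parseFloat(dur),       // seconds this line takes
    time: parseFloat(time),     // start time within the video
    command,                    // e.g. "echo", "type", "cursor", "sound"
    text: bbcodeToHtml(rest.join(" ")),
  };
}
```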
The typing effect has some randomness built into it. The line duration is divided into time per character, and the character may appear at any point during that time. That gives a more natural typing effect than having them all appear at exactly the right time.
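That jitter might look something like the sketch below (again an illustration under my own naming, not the original code): the line's duration is split into one slot per character, and each character appears at a uniformly random moment inside its own slot.

```javascript
// Sketch of the randomized typing schedule described above (names invented).
// rand is injectable so the schedule can be made deterministic for testing.
function typingSchedule(text, startTime, duration, rand = Math.random) {
  const slot = duration / text.length;          // seconds per character
  return Array.from(text, (ch, i) => ({
    ch,
    at: startTime + i * slot + rand() * slot,   // jittered within slot i
  }));
}
```

Because slot i ends exactly where slot i+1 begins, characters can never appear out of order, but the gaps between them vary, which reads far more like human typing than a fixed interval.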
Even with all the subtle details, the total was only about 200 lines of JavaScript - much less than I'd write in a typical day in the office.
When it was done, I recorded the result using OBS Studio and fed it through Premiere to add more sounds and effects, like the inset picture. The video was actually just a set of still pictures, all made from a single composited still to which I added different lighting/colouring effects in Photoshop: one for darkness, one for green night vision and one for IR. Then in Premiere I added the grain and dropped it down to 10 fps for that low-res look.
6517813 It depends on the camera, but higher-frequency infrared behaves a lot like visible red light. She probably wouldn't be able to see much out the window, even if it wasn't cold and dark out there, but she can see a reflection in the glass.
6517305 Oh, and the voice samples were from a friend, recorded in a cupboard so her friends wouldn't ask why she was randomly screaming into her phone. I added a basic "robot voice" effect (a reverb with a short time and low decay) in Audacity.
Watching this video was pretty spiffy. It's the only story I have seen that requires an understanding of Linux to make sense, but that makes it all the more enticing for me. I went and had a look at the Sweetie Bot project, but after all those videos, I am incredibly disappointed. Other than the fact that it looks like Sweetie Belle, this team is building just another homebrew robot. Granted, that is itself an expensive and interesting hobby requiring a great deal of technical knowledge, but when I think to myself "someone's building an actual Sweetie Bot!", my immediate thought after that, for minimum effectiveness, is: a chatbot multiplied by a Roomba. In other words, in my mind, what makes a "Sweetie Bot", as opposed to "a bot that looks like Sweetie Belle", is any reasonable attempt to mimic personality.
The builders of this bot are not advertising in their videos or convention demonstrations that they are in need of an AI team, and they really really should.
On a side note: in 2015, for a whopping $250, StarLily, My Magical Unicorn was released under the FurReal Friends product line, and it turns out that line of toys is made by Hasbro! I always wondered: if Hasbro was going to make a robot plushie in the shape of a unicorn, why didn't they theme it after their biggest current product line, My Little Pony?
6600903 This is the most comical version of label-slapping I've seen in the past decade. It has the same internal body as StarLily and even uses some of the same sound effects.
The literal answer was "Была такая идейка. Пока некому заняться." - roughly, "We had that idea, but right now there's nopony to work on it." Do you know of any chatbot engine to try installing? (But please, not that annoying wall-of-text bot I saw on #nouveau...)
6960250 I do not know of any chatbots as I have never used them. Since you can contact the team, why not ask them to post on their sites that they are looking for help and advice for the project? The robot itself is starting to look like Sweetie, but without any illusion of autonomy, it’s just a sculpture.
Wow!
Very well done. You have done a marvelous work.
Whoah! Feels!
So how did you get the video? Is that real video, a rendering, or something else?
Excellent work. I can tell this was truly a labor of love. Reading that letter broke me. Reading what came after broke me even more.
++impressed
I'm guessing that those footsteps mean that CelestAI sent a subvert to turn SweetieBot on?
6518071
Ah, I see. Thanks!
6516694
I'm on vacation right now, but I can't wait to get back and watch this! The story always makes me cry.
6594871
They did make a robot Twilight Sparkle:
6594871 (PeachClover)
I actually forwarded this idea to some Sweetie Bot team russian bronies -> https://tabun.everypony.ru/blog/irl-connection/189964.html#comment13008693