My Little Pony: Markov Chains are Magic! · 6:48am Feb 5th, 2016
If you've been paying attention to the /r/mylittlepony subreddit, you probably already know about this, but I made a random episode generator using Markov chains. It originated in the bot I wrote for the /r/mylittlepony Discord chat, but I wanted to make a client-side version so people can generate random nonsense to their heart's content. Have fun!
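For anyone wondering what "Markov chains" means here: the generator just records which words have followed each short word sequence in the transcripts, then takes a random walk through that table. Here's a minimal Python sketch of the idea (not the actual generator's code; the sample lines are made up):

```python
import random
from collections import defaultdict

def build_chain(lines, order=2):
    """Map every `order`-word sequence to the list of words seen after it."""
    chain = defaultdict(list)
    for line in lines:
        words = line.split()
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, length=30):
    """Start from a random key and take a random walk through the table."""
    key = random.choice(list(chain.keys()))
    order = len(key)
    words = list(key)
    for _ in range(length):
        followers = chain.get(tuple(words[-order:]))
        if not followers:  # dead end: this sequence only ever ended a line
            break
        words.append(random.choice(followers))
    return " ".join(words)

lines = ["Twilight Sparkle: Books are the real magic of friendship.",
         "Pinkie Pie: Okie dokie lokie!"]
print(generate(build_chain(lines)))
```

A higher order makes the output track the transcripts more closely; order 1 or 2 is where the entertaining nonsense lives.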
My Little Pony: Markov Chains are Magic
Stop being so horse famous! Share the fame with DisChat!
I'm down with this.
You made the pony clicker too, didn't you?
3734276
Yes, I did; you can find both in my GitHub profile if you're curious.
Like how Twilight talks about Dark magic of Friendship. Dash has a marriage with Wind Rider?!
The source transcripts could be pre-edited further: inline narrative parts assigned to characters end up inside their sentences, and could probably be separated into their own bracketed lines. There's also some odd placement of brackets and symbols, such as multiple sentences in a row without periods, mismatched bracket types, or Unicode brackets that fail to render.
3734379
I copy-pasted the transcripts from the wiki and then wrote a function to parse them. I wasn't willing to hand-edit all 100-something transcripts because that would've taken forever. Believe it or not, some of the more severe errors were already edited out.
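For reference, the wiki transcripts are mostly lines of the form "Character: dialogue", with narration and stage directions in square brackets, so the parsing function boils down to something like this (a rough sketch, not the exact code I used):

```python
import re

# Dialogue lines look like "Twilight Sparkle: No! Spike, that was a gift!",
# while narration and stage directions are wrapped in square brackets.
DIALOGUE = re.compile(r"^(?P<speaker>[^:\[\]]+):\s*(?P<text>.+)$")

def parse_transcript(raw):
    """Yield (speaker, text) pairs, dropping bracketed stage directions."""
    for line in raw.splitlines():
        match = DIALOGUE.match(line.strip())
        if not match:
            continue  # narration, scene headings, blank lines
        text = re.sub(r"\[[^\]]*\]", "", match.group("text"))  # strip inline [actions]
        text = " ".join(text.split())  # collapse leftover whitespace
        if text:
            yield match.group("speaker").strip(), text
```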
I'm pleasantly reminded of RoboRosewater.
3734403 If MLP's text consisted of ordinary words rather than Pinkie-isms, its syntax could be analyzed and the text generation adjusted to the scene, the speaking character, and the topics recognized in the speech... but it doesn't... and besides, dialogue is generally harder to analyze than descriptions.
If this isn't the season six opener I shall be very disappointed.
Dammit, I had shit I wanted to do today.
3734444
You could also just use a gigantic recurrent neural network, but I'm not sure there's enough text to train it on.
3734495 Or build a memory array for different kinds of bits of information that link to each other through an array of pointers, so that words and other symbols indirectly link to raw neural network weights and a composition of words resolves to a composition of images. Then randomly combining images would force a thousand parallel copies of the network to recognize them, search the links back to their symbols, and combine the words to produce text. Inverse problems ftw.
3734513
... or you can just use a recurrent neural network built for text processing.
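Something like a character-level LSTM is what I mean; here's a toy sketch using Keras (my library choice for illustration, not anything I've actually trained on the transcripts), and whether a hundred-odd episodes of dialogue is enough data for it is the open question:

```python
import numpy as np
from tensorflow import keras

def train_char_rnn(text, seq_len=40):
    """Train a tiny character-level LSTM to predict the next character."""
    chars = sorted(set(text))
    index = {c: i for i, c in enumerate(chars)}
    # One-hot encode (sequence, next character) training pairs.
    X = np.zeros((len(text) - seq_len, seq_len, len(chars)), dtype=np.float32)
    y = np.zeros((len(text) - seq_len, len(chars)), dtype=np.float32)
    for i in range(len(text) - seq_len):
        for t, c in enumerate(text[i:i + seq_len]):
            X[i, t, index[c]] = 1.0
        y[i, index[text[i + seq_len]]] = 1.0
    model = keras.Sequential([
        keras.layers.LSTM(128, input_shape=(seq_len, len(chars))),
        keras.layers.Dense(len(chars), activation="softmax"),
    ])
    model.compile(loss="categorical_crossentropy", optimizer="adam")
    model.fit(X, y, batch_size=128, epochs=20)
    return model, chars
```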
3734495
You might consider interpolating it with a general language model filtered by words and phrases that appear in the show. That could provide more robustness without sacrificing MLP-ness.
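By "interpolating" I just mean mixing the two models' next-word distributions with some weight, after restricting the general model to the show's vocabulary. A toy sketch of what I mean (the model lookups and the 0.7 weight are made up for illustration):

```python
import random

def interpolated_next(context, show_model, general_model, show_vocab, lam=0.7):
    """Sample the next word from lam * P_show + (1 - lam) * P_general,
    with the general model restricted to the show's vocabulary."""
    show_dist = show_model.get(context, {})  # dict: word -> probability
    general_dist = {w: p for w, p in general_model.get(context, {}).items()
                    if w in show_vocab}
    mixed = {}
    for word in set(show_dist) | set(general_dist):
        mixed[word] = (lam * show_dist.get(word, 0.0)
                       + (1 - lam) * general_dist.get(word, 0.0))
    if not mixed:
        return None
    words, weights = zip(*mixed.items())
    return random.choices(words, weights=weights, k=1)[0]
```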
Heh. ML penis.
I feel like this needs annotated episodes clipped and stitched together so that the subs match your story as closely as possible. I would so want to watch that.
3734831
Here you go:
This would be more accurately named: "If James Joyce wrote My Little Pony screenplays."