//------------------------------//
// Asymmetry (Horror)
// Story: Parma Quentaron Sintë Undómëo
// by Undome Tinwe
//------------------------------//

The concept of relative entropy was first discovered by two ponies, Kite Back and Leafblower, and is thus sometimes known as KL Divergence. Most first-year students at CSGU are surprised to learn that an idea so fundamental to their magical studies was developed by two pegasi.

Not that they were racist or anything! It was just that, well, pegasi weren't exactly known for their theoretical mathematics, what with their outdoor predilections and their lack of telekinesis for the sheer amount of writing needed for the advanced stuff. And considering how KL Divergence is mostly used in analyzing communications channels, it's not unreasonable to think that it would've been created by unicorns trying to optimize their sending spells.

But, if one were to ask a pegasus (after explaining what KL Divergence even was because, y'know, most of them are outside clearing clouds rather than studying in the library) if they thought it was odd that one of their own had developed a concept so theoretical, they'd laugh in your face and say that of course it took a pegasus to point out something completely obvious, because the unicorns were too busy with their noses in a book to look up at the sky.

In fact, when asked about the concept, Commander Smoke Trail famously replied, "Why do you think the smart unicorns live on top of mountains? Somewhere in those horns of theirs is a lick of sense, sometimes."

See, KL Divergence is a measure of the distance between two probability distributions. It's denoted as D(p||q), where we are measuring the distance from one probability distribution p to another probability distribution q. Like any distance metric, D(p||q) is equal to zero if and only if p and q are the same "point" in probability space, and if its value is large, then q is very far away from p.
Pegasi, as creatures of the sky, find probability very natural. Weather is chaos, but even in chaos there are patterns, and pegasi live or die by playing the odds with these patterns. Navigating a storm is little more than rolling the dice and making sure you remembered to weight them properly beforehoof.

Most unicorns accept this as the reason why KL Divergence was first developed by two pegasi. They're right that understanding this concept is a matter of life and death.

One of the stranger properties of the KL Divergence as a distance metric is as follows:

∃p,q: D(p||q) ≠ D(q||p)

That is, the distance of p from q according to KL Divergence is not always the same as the distance of q from p. This fact takes students at CSGU years to wrap their heads around, and even after graduating, most of them just write it off as an unintuitive quirk of a very useful measure. Sure, it's not actually possible for distance to be asymmetric, but if the equations work, then there's no harm in assuming that's how it is. It's the same kind of willful suspension of disbelief needed to survive Quantum Physics.

But the pegasi know that it's perfectly possible for distances to change depending on how you measure them. If the unicorns would take their eyes off their books and look up, then they'd see, among the intricate patterns of wind and water in the sky, shapes like clouds molded into the forms of creatures, dogs and snakes and octopi and stingrays dancing about high above the mountaintops.

They are not clouds.

And the pegasi choose to stay in the skies, to live in the clouds that float higher than Canterlot itself, because they know that distances can be deceiving. When the pegasi fly above their fellow wingless ponies, they are close to those things that swim through the air, that have wandered the heavens since the beginning of Time. And those things are far away from them.
And when their colleagues in their schools choose to stay inside, in their homes situated on the ground, they are far away from those that seek out flesh and blood and life. But the clouds-that-are-not-clouds are close to them, so close to their prey in ways the groundbound ponies could never understand, and they will never listen when the pegasi try to warn them of the dangers that lurk above.

Because in the limit as D(p||q) goes to zero, p and q must take on the same values.

And every year, there are more clouds in the sky, shaped like dogs and snakes and octopi and stingrays.

And ponies.
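
---

A note for readers who want to check the mathematics themselves: the two claims the story leans on, that D(p||q) is zero exactly when p and q agree, and that the two directions of the divergence can disagree, can both be verified numerically. A minimal sketch in Python; the distributions p and q below are arbitrary values chosen only for illustration:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_x p(x) * log2(p(x)/q(x)), in bits.

    Terms with p(x) == 0 contribute nothing, by the usual convention
    that 0 * log(0/q) = 0.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two arbitrary distributions over the same three outcomes.
p = [0.5, 0.25, 0.25]
q = [0.8, 0.1, 0.1]

d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)

print(d_pq, d_qp)                 # the two "distances" are not equal
assert abs(d_pq - d_qp) > 1e-6    # asymmetry: D(p||q) != D(q||p)
assert kl_divergence(p, p) == 0   # zero when the distributions coincide
```

The distance from p to q really does differ from the distance from q to p, just as the pegasi say.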