## Your Conception of Infinity is Almost Surely Wrong

*Content warning: this post contains bits in second person. Although the word “you” is used, it's used purely for humour and is in no way referencing the reader. I've thought a lot about different topics for blog posts, and I've finally decided to start with shitposting. I hope you enjoy.*

You may have heard something along the lines of “if there are infinite universes, there's one where (unrealistic expectation)”. I can with complete confidence say that's not true. In fact, mathematicians, whose literal job is to be pedantically rude, have a very rude way of putting this: it's *not* true, but it's *almost* true.

You see, the usual rules of the usual world don't apply when things spiral infinitely out of hand. Because there are multiple, conflicting sets of rules that are useful for infinities, we like to be specific in what set of rules we're using. This is also why no two things in mathematics have the same name. (That is a joke and also untrue.)

The infinite universes argument is one example of a very common set of rules for infinity being used when another set may also apply. We're honestly not sure if anything in the universe is infinite, but if it is, we don't know which rules the universe follows. So, we can't say that there really is a universe where you're better than me.

This kind of infinity is called *normal* infinity because it's what we generally think of as a reasonable kind of infinity. Surely, if we have infinitely many things, we must have *all* the things, right? Specifically, this phenomenon is proven in the *infinite monkey theorem*: if a monkey is typing on a keyboard in a *normally random* way, then if given infinite time, the monkey will surely type anything conceivable, including all of Shakespeare's works and the entire contents of this educational shitpost.

Alternatively phrased, the infinite monkey theorem says that a monkey typing randomly on a keyboard will *almost surely* type anything, and the *almost* is the part that says you're wrong.
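If you'd like to watch the “almost” at work, here's a toy sketch of my own (the assumptions are mine, not the theorem's): a 26-key keyboard, every key equally likely, and the monkey's output chopped into independent 6-letter blocks so the arithmetic stays easy.

```python
# Toy "infinite monkey" arithmetic. Assumptions (mine): 26 equally
# likely keys, output split into independent 6-letter blocks.
p = 26.0 ** -6                    # chance one block spells a fixed 6-letter word
for blocks in (10**6, 10**9, 10**12):
    hit = 1 - (1 - p) ** blocks   # chance at least one block spells it
    print(blocks, hit)
```

The chance climbs towards 1 as the blocks pile up, and the floating-point number will eventually round up to 1.0, but mathematically it stays strictly below 1 for every finite run. That gap is the “almost”.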

You see, the key bit is that *normal* infinity we keep coming back to. It's the part that says that, if we have infinitely many things, we have *all* the things. We do not always have all the things. It's entirely possible for the monkey to just type the letter “A” and continue doing so for all of time and finally be relatable, *if* we do not accept a normal infinity. Surely, a *normal* monkey wouldn't have the dedication, but are you sure you have a normal monkey?

Yes, given any *finite* number of letters typed by our monkey, the chance of getting that relatable string of all “A” shrinks as the number of letters grows. You may have even taken a calculus course and concluded that the limit of that chance, as the number of letters approaches infinity, is 0, and hence that an eternally relatable monkey is 0% likely. But do you believe that *no* monkey *ever* can be relatable, given an infinite amount of time?
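To put some hedged numbers on that shrinkage (again assuming, on my part, a 26-key keyboard with every key equally likely), the chance that the first n keystrokes are all “A” is (1/26)^n:

```python
# Chance that the first n keystrokes are all "A" on a 26-key keyboard
# (my assumption): (1/26)**n. It heads towards 0 but never reaches it.
for n in (1, 10, 50):
    print(n, (1 / 26) ** n)
```

Every finite n gives a strictly positive number; only the limit is 0. And “the limit is 0” is not the same claim as “it can't happen”.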

Non-normal infinities come up all the time in maths, and they show that we can't always assume an infinity is normal. For example, one third in decimal has an infinite number of fractional digits, and none of them are anything but three. There's no almost there: they're definitely three.
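If you don't trust me on the threes, you can watch the long division do it. A quick sketch (the helper name is mine):

```python
def decimal_digits(numerator, denominator, count):
    """First `count` fractional digits of numerator/denominator in base 10."""
    remainder = numerator % denominator
    digits = []
    for _ in range(count):
        remainder *= 10                    # bring down a zero
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits

print(decimal_digits(1, 3, 20))  # twenty digits, all of them 3
```

The remainder gets stuck at 1 forever, so the digit gets stuck at 3 forever. Not one of the infinitely many digits ever defects.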

An infinitely typing monkey does seem pretty contrived, and it is, but the maths still works out. Hopefully, you'll find this post almost satisfying.