Imagine a completely random event with two equally likely outcomes, say flipping a fair coin: each flip lands heads or tails with equal probability. Now suppose we're interested in how long it takes to see a particular sequence of outcomes.
Pattern 1: Tails, Heads, Tails
Pattern 2: Tails, Heads, Heads
Now suppose we flip a coin until Pattern 1 appears and note how many flips it took. We repeat this process many times and average the counts to estimate how many flips a tails-heads-tails sequence takes. Then we go through the same process for Pattern 2 to see how many flips a tails-heads-heads sequence takes.
So, on average, which pattern takes fewer coin tosses?
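If you want to check empirically before reading on, here is a minimal Monte Carlo sketch in Python; the function names, trial count, and the "T"/"H" string encoding are illustrative choices of mine, not part of the original puzzle.

```python
import random


def flips_until(pattern, rng):
    """Flip a fair coin until `pattern` appears as the most recent flips;
    return how many flips that took."""
    history = ""
    while not history.endswith(pattern):
        history += rng.choice("HT")  # fair coin: H and T equally likely
    return len(history)


def average_flips(pattern, trials=100_000, seed=0):
    """Estimate the expected waiting time for `pattern` by averaging many independent trials."""
    rng = random.Random(seed)
    return sum(flips_until(pattern, rng) for _ in range(trials)) / trials


if __name__ == "__main__":
    print("Pattern 1 (Tails, Heads, Tails):", average_flips("THT"))
    print("Pattern 2 (Tails, Heads, Heads):", average_flips("THH"))
```

Seeding `random.Random` makes a run reproducible; increase `trials` if the two averages look too noisy to compare.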