I came up with this question after watching a Numberphile video on the number of riffle shuffles required to randomize a deck. One question posed in the video was "How many one-card riffles are needed to randomize a deck?" The answer given was roughly 236 shuffles:
\begin{equation}
\sum_{k = 1}^{52} \frac{52}{k} = \frac{52}{1} + \frac{52}{2} + \frac{52}{3} + \cdots + \frac{52}{52} \approx 236
\end{equation}
In a one-card riffle, we take the top card off the deck and insert it somewhere in the deck at random, so it has a $\frac 1 {52}$ probability of going under the bottom card, say the king of diamonds. Repeating this procedure slowly moves the king of diamonds toward the top of the deck: once one card sits below it, the next inserted card has a $\frac 2 {52}$ chance of going under it, and so on, with the probability increasing by $\frac 1 {52}$ for each card that accumulates below the king of diamonds. Once the king of diamonds itself reaches the top and is reinserted, the deck is random. Each stage is a geometric waiting time with expectation $\frac{52}{k}$, which gives the summation above.
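As a sanity check, the sum can be evaluated directly (a quick sketch in Python):

```python
# Expected number of one-card riffles: a sum of geometric waiting times,
# 52/k as the k-th card accumulates under the original bottom card.
expected = sum(52 / k for k in range(1, 53))
print(expected)  # roughly 236
```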
Now, for a two-card riffle, I am having trouble understanding the casework. There are now several ways for the bottom card to move up: one of the two riffled cards goes under it, both go under, or neither does. For both to go under, is the probability $\frac 1 {52} \cdot \frac 2 {52}$? For exactly one to go under, is it $2 \cdot \frac 1 {52} \cdot \frac {50} {52}$? What would the summation look like?
Note: A two-card riffle would involve taking the top two cards of the deck and reinserting each of the two cards somewhere in the deck at random (independently of each other).
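To check the one-card figure empirically and get a rough number for the two-card case, here is a small Monte Carlo sketch in Python. The stopping rule (count shuffles until the original bottom card has itself been reinserted) and the slot counts for the two-card insertions (51 slots for the first card, 52 for the second) are my assumptions, following the note's description:

```python
import random

def one_card_time(n=52, rng=random):
    """Shuffles until the original bottom card is itself reinserted
    (one-card riffle / top-to-random shuffle)."""
    deck = list(range(n))  # deck[0] is the top; card n-1 starts at the bottom
    t = 0
    while True:
        t += 1
        card = deck.pop(0)
        deck.insert(rng.randrange(n), card)  # n equally likely slots
        if card == n - 1:
            return t

def two_card_time(n=52, rng=random):
    """Same stopping rule for the two-card riffle: stop once the original
    bottom card is among the two cards picked up and reinserted.
    Assumption: the first card goes into one of the 51 slots of the
    50-card remainder, the second into one of the 52 slots that result."""
    deck = list(range(n))
    t = 0
    while True:
        t += 1
        a, b = deck.pop(0), deck.pop(0)
        deck.insert(rng.randrange(n - 1), a)
        deck.insert(rng.randrange(n), b)
        if n - 1 in (a, b):
            return t

rng = random.Random(0)
trials = 2000
m1 = sum(one_card_time(rng=rng) for _ in range(trials)) / trials
m2 = sum(two_card_time(rng=rng) for _ in range(trials)) / trials
print(m1, m2)
```

The one-card average lands near 236, and the two-card average comes out well under that, which at least bounds what the summation should produce.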