\begin{equation} \mathcal{W}_\epsilon(\alpha, \beta) = \min_{\pi\in \Pi(\alpha, \beta)} \int c(x,y) \, \mathrm{d}\pi(x,y) + \epsilon H(\pi \| \alpha \otimes \beta) \end{equation} Cuturi (2013) introduced the entropy-regularized Wasserstein distance, or Sinkhorn distance, shown above, where $\epsilon$ is the regularization parameter and $H(\pi \| \alpha \otimes \beta)$ is the relative entropy (KL divergence) between the transport plan and the product of the marginals.
However, I have seen the regularization term written in two different ways: one as the entropy of the plan, the other as a relative entropy:
\begin{align} H(\pi) &= \int \pi(x,y) \ln \pi(x,y) \, \mathrm{d}x \, \mathrm{d}y \\ H(\pi \| \alpha \otimes \beta) &= \int \ln \left(\frac{\mathrm{d}\pi (x,y)}{\mathrm{d}\alpha(x) \, \mathrm{d}\beta(y) } \right) \mathrm{d}\pi (x,y) \end{align}
How are these two expressions connected? They are clearly not equal, so why do both versions appear in the literature?
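As a sanity check, here is a small numerical comparison of the two quantities (Python/NumPy; the marginals and the feasible plan are invented for illustration) on a discrete example:

```python
import numpy as np

# Hypothetical discrete marginals, made up for illustration.
alpha = np.array([0.2, 0.3, 0.5])
beta = np.array([0.4, 0.6])

# A strictly positive transport plan whose marginals are alpha and beta.
pi = np.array([[0.1, 0.1],
               [0.1, 0.2],
               [0.2, 0.3]])
assert np.allclose(pi.sum(axis=1), alpha)
assert np.allclose(pi.sum(axis=0), beta)

# Version 1: the (negative) entropy term, sum_ij pi_ij * ln(pi_ij).
H_pi = np.sum(pi * np.log(pi))

# Version 2: the relative entropy KL(pi || alpha x beta).
kl = np.sum(pi * np.log(pi / np.outer(alpha, beta)))

# The two values differ, but their difference depends only on the
# marginals, which are fixed over the feasible set Pi(alpha, beta).
const = np.sum(alpha * np.log(alpha)) + np.sum(beta * np.log(beta))
print(np.isclose(kl, H_pi - const))  # True
```

On this example the two regularizers give different numbers, but their difference is a constant determined by $\alpha$ and $\beta$ alone, which is what made me suspect the two versions are related rather than interchangeable.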