The key thing is to focus only on the parts of the formula in the third line that contain $\theta$ as a free variable, and to disregard everything that no longer depends on $\theta$ - those parts are just a multiplicative constant that ensures your probability integrates to one. This gives an expression for $P(\theta|x)$ up to proportionality:
$$ P(\theta|x) \propto \theta^{x+\alpha-1}(1-\theta)^{n-x+\beta-1}. $$
(Note that the integral in the denominator integrates $\theta$ out, so the result no longer depends on $\theta$.)
Now we compare this to the PDF of the beta distribution, again up to multiplicative constants - and we see that $P(\theta|x)$ is proportional to the PDF of a $\text{Beta}(x+\alpha,n-x+\beta)$ distribution.
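For reference, here is the beta density we are matching against; comparing exponents term by term with the expression above gives $a=x+\alpha$ and $b=n-x+\beta$:

$$ f(\theta; a, b) = \frac{\theta^{a-1}(1-\theta)^{b-1}}{B(a,b)} \propto \theta^{a-1}(1-\theta)^{b-1}. $$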
So we now have your posterior density $P(\theta|x)$ and the $\text{Beta}(x+\alpha,n-x+\beta)$ density. Both are probability densities, so they both integrate to one. And we just found that they are proportional to each other. But two functions that are proportional to each other and integrate to the same value must be equal.
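If you want to convince yourself numerically, here is a minimal sketch: it normalizes the unnormalized posterior $\theta^{x+\alpha-1}(1-\theta)^{n-x+\beta-1}$ by brute-force integration on a grid and checks that it matches the $\text{Beta}(x+\alpha,n-x+\beta)$ density. The particular values of $n$, $x$, $\alpha$, $\beta$ are arbitrary choices for illustration, not from the derivation above.

```python
import numpy as np
from scipy.stats import beta

# Arbitrary illustrative values: n trials, x successes, Beta(alpha, beta) prior.
n, x = 20, 7
a_prior, b_prior = 2.0, 3.0

theta = np.linspace(0.001, 0.999, 2000)
dtheta = theta[1] - theta[0]

# Unnormalized posterior: keep only the theta-dependent factors.
unnorm = theta ** (x + a_prior - 1) * (1 - theta) ** (n - x + b_prior - 1)

# Normalize by numerical integration (simple Riemann sum over the grid).
posterior = unnorm / (unnorm.sum() * dtheta)

# Conjugate result: the Beta(x + alpha, n - x + beta) density.
conjugate = beta.pdf(theta, x + a_prior, n - x + b_prior)

# The two curves should agree up to small discretization error.
print(np.max(np.abs(posterior - conjugate)))
```

The normalizing constant never has to be computed analytically: once we know the posterior is proportional to a beta density and both integrate to one, equality follows, which is exactly what the numerical check confirms.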
This way of thinking and arguing is extremely common in Bayesian statistics, especially when working with conjugate priors. Think about it and get familiar with it; you will definitely see it again.