This is the final question from IB Math HL Paper 1 Nov 2014 TZ0.
Show that: $$(1+ i\tan\theta)^n + (1-i\tan\theta)^n = \frac{2\cos n\theta}{\cos^n\theta}$$
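(As a quick sanity check before attempting a proof — not part of the official question — the identity can be verified numerically, e.g. in Python:)

```python
import math

# Numerically check (1 + i tan θ)^n + (1 - i tan θ)^n == 2 cos(nθ) / cos^n(θ)
# for a few sample θ with cos θ ≠ 0 (including one where cos θ < 0) and several n.
for theta in (0.3, 1.2, 2.5):
    for n in (2, 3, 5):
        t = math.tan(theta)
        lhs = (1 + 1j * t) ** n + (1 - 1j * t) ** n  # imaginary parts cancel
        rhs = 2 * math.cos(n * theta) / math.cos(theta) ** n
        assert abs(lhs - rhs) < 1e-9, (theta, n)
```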
I tried to solve the problem as follows:
I started by converting the left-hand side to polar form in order to apply De Moivre's Theorem. We know that $$r=\sqrt{x^2+y^2},$$ so with $x=1$ and $y=\tan\theta$ we get:
$$r=\sqrt{\tan^2\theta+1}$$
$$r=\sqrt{\sec^2\theta}$$
$$r=|\sec\theta|$$
Thus:
$$\left(\frac{1}{|\cos\theta|}(\cos\theta+i\sin\theta)\right)^n + \left(\frac{1}{|\cos\theta|}(\cos\theta-i\sin\theta)\right)^n$$
By De Moivre's Theorem:
$$\frac{1}{|\cos\theta|^n}(\cos n\theta+i\sin n\theta) + \frac{1}{|\cos\theta|^n}(\cos n\theta-i\sin n\theta)$$
Simplifying we get:
$$\frac{2\cos n\theta}{|\cos\theta|^n}$$
This matches the right-hand side only if we ignore the absolute value. But why are we allowed to ignore it, given that $\theta$ has no restriction other than $\cos\theta\neq0$?
I would have thought my answer with the absolute value was the correct one, but testing with $\theta = \pi$ and $n=3$, it gives $2 = -2$, which is false; without the absolute value it gives $2 = 2$, which holds.
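(This test case can be reproduced numerically; a Python sketch with $\theta = \pi$, $n = 3$ as above:)

```python
import math

theta, n = math.pi, 3
t = math.tan(theta)  # tan(π) = 0, up to floating-point rounding

# Left-hand side of the original identity: (1 + i tan θ)^n + (1 - i tan θ)^n ≈ 2
lhs = ((1 + 1j * t) ** n + (1 - 1j * t) ** n).real

# Right-hand side with and without the absolute value on cos θ
with_abs = 2 * math.cos(n * theta) / abs(math.cos(theta)) ** n  # ≈ -2, disagrees
without_abs = 2 * math.cos(n * theta) / math.cos(theta) ** n    # ≈  2, agrees

print(lhs, with_abs, without_abs)
```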
So my answer clearly does not work, but my question remains: why are we justified in dropping the absolute value on $\sec\theta$?