A laser beam propagates through a spherically symmetric medium. The refractive index varies with the distance $r$ from the centre of symmetry according to $$ \mu=\mu_0\frac{r}{r_0} $$ where $\mu_0=1$, $r_0=0.3\ \mathrm{m}$, and $r_0\leq r<\infty$, and the beam's trajectory lies in a plane containing the centre of symmetry $C$. At a distance $r_1=1\ \mathrm{m}$, the beam makes an angle $\phi=30^\circ$ with $\vec{r_1}$ (see figure). Find the minimum distance the beam reaches from $C$.
What I first did was apply Snell's law in the form $\mu(r)\sin\theta=\text{constant}$, evaluating it at the initial point $(r_1,\phi)$ and at the turning point, where $\theta=90^\circ$. This gave $r_f=2\ \mathrm{m}$, which seemed physically reasonable, provided the ray never reaches a critical angle along the way.
However, the answer key gives $r_f=\frac{1}{\sqrt{2}}\ \mathrm{m}$. The other explanations I have seen look as if they were worked backwards to fit that answer. Besides, since the inner regions are optically rarer here, how can the ray make its way inward at all?
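To check numerically which answer is right, one can integrate the standard geometric-optics ray equation $\frac{d}{ds}\!\left(\mu\,\hat{T}\right)=\nabla\mu$ directly for this profile and record the closest approach to $C$. Below is a crude Euler sketch (the step size, path length, and the choice to launch the ray inward on the $x$-axis are my own assumptions):

```python
import math

R0 = 0.3                                  # r0 = 0.3 m, so mu(r) = r / R0 (mu0 = 1)

# Start at r1 = 1 m on the x-axis, heading inward at 30 degrees to the radius.
x, y = 1.0, 0.0
phi = math.radians(30.0)
tx, ty = -math.cos(phi), math.sin(phi)    # unit tangent of the ray

ds = 1e-5                                 # arclength step (arbitrary, small)
r_min = 1.0
for _ in range(200_000):                  # ~2 m of path, enough to pass the turning point
    r = math.hypot(x, y)
    mu = r / R0
    gx, gy = x / (R0 * r), y / (R0 * r)   # grad mu points radially outward, magnitude 1/R0
    gdt = gx * tx + gy * ty
    # Ray equation d(mu T)/ds = grad mu  =>  dT/ds = (grad mu - (grad mu . T) T) / mu
    tx += (gx - gdt * tx) / mu * ds
    ty += (gy - gdt * ty) / mu * ds
    norm = math.hypot(tx, ty)
    tx, ty = tx / norm, ty / norm         # keep the tangent a unit vector
    x += tx * ds
    y += ty * ds
    r_min = min(r_min, math.hypot(x, y))

print(r_min)                              # comes out near 0.7071, i.e. 1/sqrt(2)
```

The trace turns around near $r\approx 0.707\ \mathrm{m}$, which agrees with the answer key rather than with my $2\ \mathrm{m}$, so the discrepancy seems to lie in which quantity is actually conserved along the ray.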
Please help.