When light passes from an optically denser medium to an optically rarer medium, it bends away from the normal in accordance with Snell's law of refraction, $\mu_1\sin i =\mu_2\sin r$, where $\mu_1$ and $\mu_2$ are the refractive indices of the two media ($\mu_1>\mu_2$), and $i$ and $r$ are the angles of incidence and refraction, respectively.
When we increase $i$ gradually, $r$ also increases. Since $r>i$ (because $\mu_1>\mu_2$), $r$ reaches $90^\circ$ before $i$ does. The angle of incidence for which the angle of refraction is $90^\circ$ is called the critical angle $i_c$. From Snell's law we get $\sin i_c =\mu_2/\mu_1$. This is how the critical angle is explained in almost all sources. Beyond this angle, it is said, Snell's law of refraction is no longer valid and refraction is not possible; only reflection, or more precisely total internal reflection, takes place.
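As a quick numerical check of the formula above (using assumed example values, $\mu_1 = 1.5$ for glass and $\mu_2 = 1.0$ for air, which are not part of the original statement):

```python
import math

# Assumed example media: glass (mu1 ~ 1.5) to air (mu2 ~ 1.0)
mu1, mu2 = 1.5, 1.0

# Critical angle from sin(i_c) = mu2 / mu1
i_c = math.degrees(math.asin(mu2 / mu1))
print(f"critical angle: {i_c:.1f} degrees")  # about 41.8 degrees

# For i > i_c, Snell's law would require sin(r) > 1, which has no real solution,
# so no refracted ray exists -- total internal reflection:
sin_r = (mu1 / mu2) * math.sin(math.radians(45.0))  # try i = 45 degrees > i_c
print(sin_r)  # greater than 1
```

The point of the second calculation is that beyond $i_c$ the equation itself has no real-angle solution for $r$, which is the usual justification for saying refraction "stops".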
I do not understand why we still consider the refractive index of the medium at the grazing refracted ray ($r=90^\circ$) to be $\mu_2$. How can we ascertain the speed of light (and hence the refractive index) in the interface separating the two media? According to the derivations, the speed of light at the interface is the same as in the medium other than the one from which the ray emerged (here, the rarer medium). But why do we choose it this way? Why can't it be the other way round?