The intended output of many lasers used in laser scanning is a Gaussian beam. At distance $z$ from the waist, the radius of a Gaussian beam is given by $$w(z) = w_0 \sqrt{1+(z/z_R)^2},$$ where $w_0$ is the waist radius and $z_R = \pi w_0^2/\lambda$ is the Rayleigh range, which depends on the waist $w_0$ and the wavelength $\lambda$. When the distance $z$ is much larger than $z_R$, the radius $w$ grows approximately linearly, $$w(z) \approx \theta z,$$ where $$\theta = \frac{\lambda}{\pi w_0}$$ is the divergence angle.
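These two formulas can be sanity-checked numerically; below is a minimal Python sketch (the function names `beam_radius` and `divergence` are my own, not from any library):

```python
import math

def beam_radius(z, w0, wavelength):
    """Gaussian beam radius w(z) at distance z from the waist (all SI units)."""
    z_R = math.pi * w0**2 / wavelength  # Rayleigh range
    return w0 * math.sqrt(1 + (z / z_R)**2)

def divergence(w0, wavelength):
    """Far-field divergence angle theta = lambda / (pi * w0)."""
    return wavelength / (math.pi * w0)
```

For $z \gg z_R$, `beam_radius(z, w0, wavelength)` approaches `divergence(w0, wavelength) * z`, matching the linear approximation.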
I am modeling a laser beam from a laser scanning device as a Gaussian beam, but I am not sure how to determine the waist radius $w_0$. For example, the beam footprint at the exit aperture is given as $5$ mm, the divergence is reported as 0.5 mrad, and the wavelength is 1500 nm.
Then we may calculate $$w_0 = \frac{\lambda}{\pi \theta} = \frac{1500\,\text{nm}}{\pi \times 0.5 \times 10^{-3}} = 0.0009549\ldots\,\text{m}$$ and $$z_R = \frac{\pi w_0^2}{\lambda} = \frac{\pi \times (0.0009549\ldots\,\text{m})^2}{1500\,\text{nm}} = 1.909\ldots\,\text{m}.$$
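The arithmetic above can be reproduced directly (a small sketch, SI units throughout; variable names are my own):

```python
import math

wavelength = 1500e-9  # 1500 nm, in metres
theta = 0.5e-3        # reported divergence, 0.5 mrad

w0 = wavelength / (math.pi * theta)  # waist radius from theta = lambda / (pi * w0)
z_R = math.pi * w0**2 / wavelength   # Rayleigh range

print(w0)   # ≈ 9.549e-4 m
print(z_R)  # ≈ 1.91 m
```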
But then the radius equals the output radius $5\,\text{mm}/2 = 2.5\,\text{mm}$ only at distance $z \approx 4.6209\,\text{m}$, which is absurd: the device is surely not five meters long. So have I misunderstood the parameters of the Gaussian beam, or is there some optical trick happening inside the laser device? Or is the Gaussian model accurate only for the beam's Gaussian shape, not for the radius calculation?
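For reference, the $z \approx 4.6209$ m figure comes from inverting $w(z)$ for $z$; a sketch under the same assumptions as above:

```python
import math

wavelength = 1500e-9  # 1500 nm
theta = 0.5e-3        # 0.5 mrad
w0 = wavelength / (math.pi * theta)
z_R = math.pi * w0**2 / wavelength

# Invert w(z) = w0 * sqrt(1 + (z/z_R)^2) for the distance at which
# the beam reaches a given radius w_target > w0
w_target = 2.5e-3  # half of the 5 mm exit footprint
z = z_R * math.sqrt((w_target / w0)**2 - 1)
print(z)  # ≈ 4.62 m
```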