So, here is the question: Assume a 2D Cartesian system with basis vectors $\hat{i}$ and $\hat{j}$. Two cars, A and B, are located at $(0, 6)$ and $(-30, 0)$ respectively. Car A moves with a constant velocity of $-3\,\hat{j}$ m/s and car B with $4\,\hat{i}$ m/s. The question asks for the minimum separation between the two cars.
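For concreteness, here is my own restatement of the setup (taking $t = 0$ as the moment both cars start moving, distances in meters, $t$ in seconds):

$$\vec r_A(t) = (0,\; 6 - 3t), \qquad \vec r_B(t) = (-30 + 4t,\; 0),$$

so the quantity to be minimized is the separation

$$d(t) = \lvert \vec r_A(t) - \vec r_B(t) \rvert = \sqrt{(30 - 4t)^2 + (6 - 3t)^2}.$$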
I understand we could solve this using calculus (minimizing $d(t)$ via its derivative), but that involves a lot of computation. So my professor explained a way to solve it using relative motion, i.e. assuming either body's frame and working out the problem in that frame. Here, if we sit in the frame of B, we see that A has a velocity of $(0, -3) - (4, 0) = -(4\hat{i} + 3\hat{j})$ m/s. If we draw the line that A traverses in B's frame (the line through A's initial position $(0, 6)$ with slope $3/4$, the direction of this relative velocity), we get $3x - 4y + 24 = 0$.

What the professor did next confused me: they said that the minimum separation between the cars is the shortest distance from the point where car B was initially, $(-30, 0)$, to the line above.

What I do not get is: why is the shortest distance measured from that point to the line? It feels weird to me because B's origin (of its own particular coordinate system) is changing all the time. So how did they conclude that we have to calculate the distance from there? I am sorry if I could not make complete sense.
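For what it's worth, I checked numerically that the two methods agree (both give $13.2$ m), so the professor's shortcut clearly works; I just don't see why. Here is the quick check I put together (my own sketch, not from the professor):

```python
import numpy as np

# Time grid (seconds); the minimum occurs well before t = 20 s
t = np.linspace(0.0, 20.0, 200_001)

# Car A starts at (0, 6) m and moves with velocity -3 j m/s
ax, ay = np.zeros_like(t), 6.0 - 3.0 * t
# Car B starts at (-30, 0) m and moves with velocity 4 i m/s
bx, by = -30.0 + 4.0 * t, np.zeros_like(t)

# Separation as a function of time, minimized by brute force
d = np.hypot(ax - bx, ay - by)
print(f"minimum separation (direct):    {d.min():.4f} m")  # -> 13.2000 m

# Professor's method: perpendicular distance from B's initial
# point (-30, 0) to the relative-motion line 3x - 4y + 24 = 0
d_line = abs(3 * (-30) - 4 * 0 + 24) / np.hypot(3.0, 4.0)
print(f"distance from (-30, 0) to line: {d_line:.4f} m")   # -> 13.2000 m
```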