I've recently picked up a math book I haven't read since college (highly recommended reading, by the way!). I was reviewing multi-dimensional derivatives and such when I stumbled upon a problem I've been trying to solve for two days, and I can't get it out of my head, so please help me out! = )
Problem (from memory):
There is a rabbit that runs in a perfect circle of radius $r$ with a constant speed $v$. A fox chases the rabbit: it starts from the center of the circle, also moves with constant speed $v$, and stays at all times on the segment between the center of the circle and the rabbit. How long will it take for the fox to catch the rabbit?
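In case it helps, here is how I'd set things up (my own choice of coordinates, not from the book; I'm assuming the rabbit starts at $(r, 0)$ and runs counterclockwise):
$$
y(t) = \Bigl(r\cos\tfrac{vt}{r},\; r\sin\tfrac{vt}{r}\Bigr),
\qquad
x(t) = \lambda(t)\,y(t) \quad\text{for some } 0 \le \lambda(t) \le 1,
$$
with $\lambda(0) = 0$ (fox at the center) and the fox catching the rabbit at the first time $T$ with $\lambda(T) = 1$.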
I tried using the fact that $|x'(t)| = |y'(t)| = v$, where $x(t)$ is the fox's position and $y(t)$ is the rabbit's position, and that $x'(t)\cdot x''(t) = 0$ because of the constant-speed restriction, but I'm still failing to find a solution.
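(To spell out where that orthogonality condition comes from, writing the product as a dot product since these are vectors:
$$
x'(t)\cdot x'(t) = |x'(t)|^2 = v^2
\;\Longrightarrow\;
\frac{d}{dt}\bigl(x'(t)\cdot x'(t)\bigr) = 2\,x'(t)\cdot x''(t) = 0,
$$
i.e. the velocity is always perpendicular to the acceleration.)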
Anyone feel like attacking this one?