
I've been working on a problem for my Classical Mechanics 2 course and I'm stuck on a small math step. Basically, I'm trying to derive this equation of motion from a Lagrangian:

$$m\ddot{r} = F + 2m\dot{r} \times \Omega + m(\Omega \times r) \times \Omega.$$

First I found the Lagrangian in the inertial frame, which I believe is just $$L' = \frac{1}{2}m\dot{r}'^2 - U.$$ I know that $v' = v + \Omega \times r$, so substituting this, the Lagrangian in the non-inertial frame should be $$L = \frac{1}{2}m(\dot{r} + \Omega \times r)^2 - U,$$ but when taking the partial derivative with respect to $r$ I don't know how to differentiate $\Omega \times r$. I'm only asking about the math, but in case it helps: the origin is the same for both frames, and $\Omega$ is constant.


3 Answers


Wikipedia has the answer here: https://en.wikipedia.org/wiki/Cross_product

The product rule of differential calculus applies to any bilinear operation, and therefore also to the cross product: \begin{equation} \frac{d}{dt}(a\times b) = \frac{da}{dt}\times b+a\times\frac{db}{dt} \end{equation}
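If you want to convince yourself symbolically, here is a quick sympy check (the component functions of $a$ and $b$ below are arbitrary examples, not anything from your problem):

```python
import sympy as sp

t = sp.symbols('t')
# Arbitrary example vectors a(t), b(t); any smooth components would do
a = sp.Matrix([sp.sin(t), t**2, sp.exp(t)])
b = sp.Matrix([t, sp.cos(t), 1])

# d/dt (a x b)  versus  a' x b + a x b'
lhs = a.cross(b).diff(t)
rhs = a.diff(t).cross(b) + a.cross(b.diff(t))

assert sp.simplify(lhs - rhs) == sp.zeros(3, 1)
```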

  • This is with respect to time. I'm asking about taking the derivative with respect to a variable that is also inside the cross product, $\frac{d}{dr}(\Omega \times r)$. I didn't think this still held in that scenario; does it?
    – maxxslatt
    Commented Aug 26, 2020 at 18:28
  • @maxxslatt If you use index notation, it might clear up your confusion (no more cross product!), and also make it clear what $\partial\vec r/\partial\vec r$ means.
    – G. Smith
    Commented Aug 26, 2020 at 20:10

Let $G:\Bbb{R}^3\to \Bbb{R}^3$ be defined as $G(r) = \Omega \times r$. Note that $G$ is a linear transformation in the sense of linear-algebra ($G(cr_1 + r_2) = c G(r_1) + G(r_2)$ for all $c\in \Bbb{R}, r_1,r_2\in \Bbb{R}^3$). So, at any point $a\in\Bbb{R}^3$, its derivative (or total derivative, or Frechet derivative whatever terminology you prefer most) is simply $dG_a(\cdot) = G(\cdot)$. In other words, at every point, $G$ is its own derivative.
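A concrete way to see this (the numbers below are arbitrary): write $\Omega \times {}$ as the constant skew-symmetric matrix $[\Omega]_\times$, so that $G(r) = [\Omega]_\times r$, and check that a finite-difference Jacobian of $G$ is just $[\Omega]_\times$ again:

```python
import numpy as np

Omega = np.array([0.3, -1.2, 0.7])  # arbitrary example angular velocity

# The linear map G(r) = Omega x r as a matrix K = [Omega]_x, so G(r) = K @ r
K = np.array([[0.0,      -Omega[2],  Omega[1]],
              [Omega[2],  0.0,      -Omega[0]],
              [-Omega[1], Omega[0],  0.0]])

r = np.array([1.0, 2.0, 3.0])
assert np.allclose(np.cross(Omega, r), K @ r)

# Finite-difference Jacobian of G at r agrees with K: the derivative of a
# linear map is the map itself (here exactly, since G is linear in r)
eps = 1e-6
J = np.column_stack([(np.cross(Omega, r + eps * e) - np.cross(Omega, r)) / eps
                     for e in np.eye(3)])
assert np.allclose(J, K, atol=1e-5)
```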


Really Explicit and Systematic Approach to Differentiation

But, this by itself is probably not very helpful for you in simplifying the math. I'll first do this the systematic way, and then later you can see the "quick way" (which is still rigorous if everything is interpreted properly). You have a Lagrangian $L:\Bbb{R}^3\times \Bbb{R}^3\to \Bbb{R}$ defined as \begin{align} L(a,b) &= \dfrac{m}{2}\lVert b + \Omega \times a\rVert^2 - U(a) \tag{$*$} \end{align}

Now, to be precise, we shall define the following functions:

  • $r:\Bbb{R}^3\times \Bbb{R}^3\to \Bbb{R}^3$, $r(a,b) := a$. In words, $r$ is that function which eats a pair of vectors and spits out the first vector.

  • $v:\Bbb{R}^3\times \Bbb{R}^3\to \Bbb{R}^3$, $v(a,b) := b$. So, $v$ is that function which eats a pair of vectors and spits out the second vector.

With this, we can rewrite $(*)$ as follows: \begin{align} L &= \frac{m}{2}\lVert v + \Omega \times r \rVert^2 - U\circ r \end{align} (The only reason I explicitly defined these functions is so that we have a proper, mathematically correct equality between functions; where both the LHS and RHS are functions $\Bbb{R}^3\times \Bbb{R}^3\to \Bbb{R}$, and they agree at every point of their domain).

Now, here are the "rules" of differentiation you need to know:

  • If $T:V\to W$ is a linear transformation between finite-dimensional spaces, then for every point $a\in V$, we have $dT_a = T$. In words, this says a linear transformation is its own derivative.

  • Chain Rule: if $f:V\to W$ and $g:W\to X$ are differentiable functions, then for every $a\in V$, we have $d(g\circ f)_a = dg_{f(a)} \circ df_a$.

  • A "generalized product rule". Essentially, it amounts to "keep the first, differentiate the second plus differentiate the first keep the second"; but see the link for the precise statement.

So, for example, since $r(a,b) = a$ is a linear transformation, we have $dr_{(a,b)} = r$; or even more explicitly, for all $(a,b), (\xi,\eta)\in \Bbb{R}^3\times \Bbb{R}^3$, we have \begin{align} dr_{(a,b)}(\xi,\eta) = r(\xi,\eta) = \xi. \end{align} Similarly, $dv_{(a,b)} = v$, or more explicitly, $dv_{(a,b)}(\xi,\eta) = v(\xi,\eta) = \eta$. Next, by combining this with the chain rule, we have \begin{align} d(U\circ r)_{(a,b)} = dU_{r(a,b)} \circ dr_{(a,b)} = dU_a \circ r \end{align}

Finally, by recalling that $\lVert \xi\rVert^2 = \langle \xi,\xi\rangle$, we can calculate the derivative of the function $\lVert v + \Omega \times r\rVert^2$, and this is for any $(a,b)\in \Bbb{R}^3\times \Bbb{R}^3$, \begin{align} d\left(\lVert v + \Omega \times r\rVert^2\right)_{(a,b)}(\xi,\eta) &= 2 \langle (v + \Omega \times r)(a,b), d(v + \Omega \times r)_{(a,b)}(\xi,\eta)\rangle \\ &= 2\langle b + \Omega \times a, dv_{(a,b)}(\xi,\eta) + \Omega \times dr_{(a,b)}(\xi,\eta) \rangle \\ &= 2\langle b + \Omega \times a, \eta + \Omega \times \xi \rangle \\ &= 2 \bigg(\langle b + \Omega \times a, \eta\rangle + \langle b + \Omega \times a, \Omega \times \xi\rangle \bigg) \end{align} Now, we use the well-known identity $\langle \alpha, \beta\times \gamma\rangle = \langle \alpha\times \beta, \gamma \rangle$ (i.e $\alpha \cdot (\beta\times \gamma) = (\alpha\times \beta)\cdot \gamma$) to the last term to get \begin{align} d\left(\lVert v + \Omega \times r\rVert^2\right)_{(a,b)}(\xi,\eta) &= 2 \bigg( \langle (b + \Omega \times a)\times \Omega, \xi\rangle + \langle b + \Omega \times a, \eta\rangle \bigg) \end{align}
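The triple-product identity used in the last step is easy to verify symbolically, e.g. with sympy (the component symbols are arbitrary placeholders):

```python
import sympy as sp

# Three fully generic vectors with symbolic components
a, b, g = (sp.Matrix(sp.symbols(f'{n}1 {n}2 {n}3')) for n in ('a', 'b', 'g'))

# Scalar triple product identity: a . (b x g) == (a x b) . g
assert sp.expand(a.dot(b.cross(g)) - (a.cross(b)).dot(g)) == 0
```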

So, if you combine everything I've said so far, we have that \begin{align} dL_{(a,b)}(\xi,\eta) &= m \bigg( \langle (b + \Omega \times a)\times \Omega, \xi\rangle + \langle b + \Omega \times a, \eta\rangle \bigg) - dU_a(\xi) \end{align}

Now, recall that the gradient is defined as the vector $\nabla U(a)$ such that for all $\xi \in \Bbb{R}^3$, \begin{align} \langle \nabla U(a), \xi\rangle = dU_a(\xi). \end{align}

Hence, the equation becomes \begin{align} dL_{(a,b)}(\xi,\eta) &= \langle m [(b + \Omega \times a)\times \Omega] - \nabla U(a),\,\, \xi\rangle + m\langle b + \Omega \times a, \eta \rangle \tag{$\ddot{\smile}$} \end{align}

In other words, the way to interpret this result is that \begin{align} \begin{cases} \frac{\partial L}{\partial r}(a,b) &= m [(b + \Omega \times a)\times \Omega] - \nabla U(a)\\ \frac{\partial L}{\partial v}(a,b) &= m(b+ \Omega \times a) \tag{i} \end{cases} \end{align}

Or if you don't plug in the points of evaluation, then we can rewrite this result as:

\begin{align} \begin{cases} \frac{\partial L}{\partial r} &= m [(v + \Omega \times r)\times \Omega] - (\nabla U) \circ r\\ \frac{\partial L}{\partial v} &= m\left(v+ \Omega \times r\right) \tag{ii} \end{cases} \end{align} and in this form it's a proper equality of functions $\Bbb{R}^3\times \Bbb{R}^3 \to \Bbb{R}^3$ (and in this notation, things are also probably more familiar to you). Now, I'm sure you can apply the Euler-Lagrange equations, and rearrange terms appropriately to get the desired equations of motion.
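As a sanity check of (ii), sympy reproduces both partial derivatives (I leave out the potential $U$ here, so only the kinetic part is differentiated):

```python
import sympy as sp

m = sp.symbols('m', positive=True)
r = sp.Matrix(sp.symbols('r1 r2 r3'))   # position components
v = sp.Matrix(sp.symbols('v1 v2 v3'))   # velocity components
Om = sp.Matrix(sp.symbols('W1 W2 W3'))  # constant Omega

w = v + Om.cross(r)
L = sp.Rational(1, 2) * m * w.dot(w)    # kinetic part only; U(r) omitted

dLdr = sp.Matrix([sp.diff(L, ri) for ri in r])
dLdv = sp.Matrix([sp.diff(L, vi) for vi in v])

# Matches (ii): dL/dr = m (v + Omega x r) x Omega,  dL/dv = m (v + Omega x r)
assert sp.simplify(dLdr - m * w.cross(Om)) == sp.zeros(3, 1)
assert sp.simplify(dLdv - m * w) == sp.zeros(3, 1)
```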


"Quicker" Way of Performing the Differentiation

Now, the "quick way" of deriving the result is to simply not include the point of evaluation $(a,b)$ of the derivatives, and not include the vectors $(\xi,\eta)$ on which the derivative is evaluated. Also, let's write the inner product using a $\cdot$ rather than $\langle \cdot, \cdot\rangle$. Then, from \begin{align} L&= \frac{m}{2}\lVert v+ \Omega \times r\rVert^2 - U\circ r, \end{align} we get \begin{align} dL &= \frac{m}{2} \cdot 2 (v+ \Omega \times r)\cdot (dv + \Omega \times dr) - d(U\circ r) \\ &= m \bigg( (v+ \Omega \times r) \cdot (\Omega \times dr) + (v+ \Omega \times r) \cdot dv\bigg) - [(\nabla U)\circ r]\cdot dr \\ &= m \bigg( [(v+ \Omega \times r)\times \Omega] \cdot dr + (v+ \Omega \times r) \cdot dv\bigg) - [(\nabla U)\circ r]\cdot dr \\ &= \bigg(m [(v+ \Omega \times r)\times \Omega] - \nabla U \circ r\bigg) \cdot dr + m (v+ \Omega \times r) \cdot dv \tag{$\ddot{\smile}\ddot{\smile}$} \end{align}

Now, note that $(\ddot{\smile})$ and $(\ddot{\smile}\ddot{\smile})$ say the exact same thing because if you evaluate $(\ddot{\smile}\ddot{\smile})$ on $(a,b)$ and then on $(\xi,\eta)$ (and you revert back to the $\langle \cdot, \cdot \rangle$ notation) we get the exact same thing. Hence, we can easily read off the derivatives in (i) and (ii).

(By now you should be able to justify each equal sign carefully, and also interpret each equality as a proper equality of functions with appropriate domain and target space, and know where various things need to be plugged in to make an evaluation. By the way, this is pretty much what Landau and Lifshitz do in Volume $1$, page $128$.)
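If you want to check the whole computation end to end, here is a sympy sketch that applies the Euler-Lagrange equations to $L$ (with $U$ omitted, so $F = 0$) and compares against the target equation of motion:

```python
import sympy as sp

t, m = sp.symbols('t m', positive=True)
r = sp.Matrix([sp.Function(f'r{i}')(t) for i in (1, 2, 3)])
Om = sp.Matrix(sp.symbols('W1 W2 W3'))  # constant Omega
rdot = r.diff(t)

w = rdot + Om.cross(r)
L = sp.Rational(1, 2) * m * w.dot(w)    # U omitted, so F = 0 here

# Euler-Lagrange: d/dt (dL/d rdot) - dL/dr = 0, component by component
EL = sp.Matrix([sp.diff(sp.diff(L, rdot[i]), t) - sp.diff(L, r[i])
                for i in range(3)])

# Target equation of motion with F = 0:
# m rddot - 2 m (rdot x Omega) - m (Omega x r) x Omega = 0
target = m * r.diff(t, 2) - 2 * m * rdot.cross(Om) - m * (Om.cross(r)).cross(Om)

assert sp.simplify(EL - target) == sp.zeros(3, 1)
```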


As far as I'm concerned, I was taught that the beauty of the Lagrangian formalism lies in reducing the dimension of the problem by choosing adequate generalized coordinates. The second equation you've written is one-dimensional: it depends only on $\dot{r}$, the derivative of $|\vec{r}|=r$, so it's easy to differentiate $L$ with respect to $r=|\vec{r}|$ or with respect to $\dot{r}=|\dot{\vec{r}}|$.

But your third equation is different: it's no longer one-dimensional. I think your mistake is that while $\vec{v}'=\vec{v}+\vec{\Omega}\times \vec{r}$ is true, you can't apply this to $v$. I think what you need to find out is whether $v'=|\vec{v}'|$ can be differentiated with respect to $r$ or not. So it may not even be necessary to differentiate the cross product, because what you really want to differentiate is the dot product $\vec{v}'\cdot\vec{v}'$.
