
Suppose that a function $f: \mathbb{R}^n \to \mathbb{R}^n$ is continuously differentiable on $\mathbb{R}^n$, and that the Jacobian of $f$ at a point $x_0 \in \mathbb{R}^n$ is non-zero, that is, \begin{equation} \begin{vmatrix} \frac{\partial f_1\left(x_{0}\right)}{\partial x_{1}} & \frac{\partial f_1\left(x_{0}\right)}{\partial x_{2}} & \dots & \frac{\partial f_1\left(x_{0}\right)}{\partial x_{n}} \\ \frac{\partial f_2\left(x_{0}\right)}{\partial x_{1}} & \frac{\partial f_2\left(x_{0}\right)}{\partial x_{2}} & \dots & \frac{\partial f_2\left(x_{0}\right)}{\partial x_{n}} \\ \vdots & \vdots & \ddots & \vdots\\ \frac{\partial f_n\left(x_{0}\right)}{\partial x_{1}} & \frac{\partial f_n\left(x_0\right)}{\partial x_{2}} & \dots & \frac{\partial f_n\left(x_0\right)}{\partial x_{n}}\\ \end{vmatrix} \neq 0. \end{equation} How can one prove that there exists a neighborhood of $x_{0}$ on which the Jacobian of $f$ is non-zero? That is, I need to deduce the following proposition: \begin{equation} \exists \delta > 0, \forall x \in \mathbb{R}^{n},\lVert x-x_{0} \rVert < \delta \longrightarrow \begin{vmatrix} \frac{\partial f_{1}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{1}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{1}\left(x\right)}{\partial x_{n}} \\ \frac{\partial f_{2}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{2}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{2}\left(x\right)}{\partial x_{n}} \\ \vdots & \vdots & \ddots & \vdots\\ \frac{\partial f_{n}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{n}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{n}\left(x\right)}{\partial x_{n}}\\ \end{vmatrix} \neq 0. \end{equation}

Below is my own thinking and doubts about the proof.

Since $f$ is continuously differentiable, we have the following family of propositions: \begin{equation} \forall \epsilon > 0, \exists \delta > 0, \forall x \in \mathbb{R}^{n}, \lVert x - x_{0} \rVert < \delta \longrightarrow \left\vert \frac{\partial f_{i}\left(x\right)}{\partial x_{j}} - \frac{\partial f_{i}\left(x_{0}\right)}{\partial x_{j}} \right\vert < \epsilon, i,j = 1,2,\dots,n. \end{equation} For a proof by contradiction, we would assume the following: \begin{equation} \forall \delta > 0, \exists x \in \mathbb{R}^{n},\lVert x-x_{0} \rVert < \delta \wedge \begin{vmatrix} \frac{\partial f_{1}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{1}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{1}\left(x\right)}{\partial x_{n}} \\ \frac{\partial f_{2}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{2}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{2}\left(x\right)}{\partial x_{n}} \\ \vdots & \vdots & \ddots & \vdots\\ \frac{\partial f_{n}\left(x\right)}{\partial x_{1}} & \frac{\partial f_{n}\left(x\right)}{\partial x_{2}} & \dots & \frac{\partial f_{n}\left(x\right)}{\partial x_{n}}\\ \end{vmatrix} = 0, \end{equation} and derive a contradiction. However, I am not sure how to relate these entrywise bounds on the partial derivatives to the Jacobian determinant. Expanding the determinant from its linear-algebra definition looks formidable, yet there seems to be little way around it.

  • I changed $f: \mathbb{R}^n \mapsto \mathbb{R}^n$ to $f: \mathbb{R}^n \to \mathbb{R}^n.$ You used the wrong arrow. The arrow $\text{“} \mapsto \text{”}$ is used (for example) to distinguish between the functions $x\mapsto (x+2y)^2$ and $y \mapsto (x+2y)^3.$ – Michael Hardy, Jan 2, 2021 at 18:52
  • @MichaelHardy Thanks for your comment. Now I can write more rigorous symbols. – Ziqi Fan, Jan 2, 2021 at 19:50

1 Answer


You're going too deep into the details. If you stay at a higher level, you'll realize that the determinant of a square matrix is a polynomial function of the entries of the matrix, and is therefore a continuous function of its inputs. So you can see the determinant of the Jacobian as a composition of two functions: the first one from $\mathbb R^n$ to $\mathbb R^{n \times n}$ sends a vector to the Jacobian matrix, and this is a continuous function because each partial derivative is continuous by assumption; and the second one sends a matrix to its determinant, which is continuous because it is polynomial in the entries.
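To make the continuity quantitative, one can also bound the determinant's variation directly via the Leibniz formula; here is a sketch, where $M$ denotes an assumed bound on the entries $a_{ij}(x) = \frac{\partial f_{i}(x)}{\partial x_{j}}$ on some neighborhood of $x_0$ (such a bound exists by continuity): \begin{equation} \det J(x) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} a_{i\sigma(i)}(x), \end{equation} and if $\left\vert a_{ij}(x) - a_{ij}(x_0) \right\vert < \epsilon$ for all $i, j$, then telescoping each product of $n$ entries term by term bounds it by $n M^{n-1} \epsilon$, so summing over the $n!$ permutations, \begin{equation} \left\vert \det J(x) - \det J(x_0) \right\vert \le n! \, n \, M^{n-1} \epsilon. \end{equation} Taking $\epsilon < \vert \det J(x_0) \vert / (n! \, n M^{n-1})$ then keeps the determinant non-zero.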

So if the determinant of the Jacobian is non-zero at a point, you can find a neighborhood of that point on which the determinant remains, say, positive (if its value at the point was positive) or negative (in the other case), and in particular non-zero.
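Since the argument is purely about continuity, a quick numerical sanity check is easy to run. The sketch below uses a hypothetical map $f(x, y) = (e^x + y,\; x - y^2)$, chosen only for illustration, and NumPy for the determinant; at $x_0 = (0, 0)$ the Jacobian determinant equals $-1$, and sampling points in a small ball around $x_0$ shows it keeping its sign there.

```python
# Numerical illustration (not a proof) of the answer's argument: for a sample
# C^1 map f: R^2 -> R^2, the Jacobian determinant is a continuous function of
# the point, so it keeps its sign on a small ball around a point where it is
# non-zero. The map f below is a hypothetical example chosen for simplicity.
import numpy as np

def jacobian_det(x, y):
    # For f(x, y) = (e^x + y, x - y^2), the Jacobian matrix is
    # [[e^x, 1], [1, -2y]], with determinant -2y*e^x - 1.
    J = np.array([[np.exp(x), 1.0],
                  [1.0, -2.0 * y]])
    return np.linalg.det(J)

x0 = (0.0, 0.0)
print(jacobian_det(*x0))  # approximately -1: non-zero at x0

# Sample many points in a ball of radius ~delta around x0: the determinant
# stays negative, i.e. it never crosses zero, throughout.
rng = np.random.default_rng(0)
delta = 0.1
points = np.array(x0) + delta * rng.uniform(-1.0, 1.0, size=(1000, 2))
dets = [jacobian_det(px, py) for px, py in points]
print(all(d < 0 for d in dets))  # True
```

Of course, the radius here is found by inspection; the theorem guarantees such a $\delta$ exists, not any particular value.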

Hope that helps,

  • This is a great explanation. Thanks. – Ziqi Fan, Jan 2, 2021 at 18:26
