
In my textbook it says that if you multiply a row in a matrix $A$ by a nonzero constant $c$ to obtain $B$, then $\det{B}=c\det{A}$.

Later on it says that if you obtain $B$ by adding $c$ times the $k^{\text{th}}$ row of $A$ to the $j^{\text{th}}$ row, then $\det{B}=\det{A}$.
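
To put the two statements side by side (this is just a restatement of the above in row notation, with the usual proviso that $j \ne k$ in the second rule): $$R_k \rightarrow cR_k \ (c \ne 0) \implies \det B = c\det A, \qquad R_j \rightarrow R_j + cR_k \ (j \ne k) \implies \det B = \det A.$$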

Isn't this a contradiction, though? Isn't adding $c$ times the $k^{\text{th}}$ row of $A$ to the $j^{\text{th}}$ row equivalent to first multiplying the $k^{\text{th}}$ row by $c$, which scales the determinant by a factor of $c$, and then adding that row to the $j^{\text{th}}$ row?

In other words, is (I) the same as (II) with the $2$ steps combined?
I. $cR_k + R_j \rightarrow R_j$. $1$ step in total.
II. First, do $cR_k \rightarrow R_k$. Second, add the now-scaled row $k$ to row $j$, i.e. $R_k + R_j \rightarrow R_j$. $2$ steps in total.

  • @Eric No, of course not. The $k$-th row of the matrix you get by doing the operations in your last paragraph is $c$ times what you started with, so you need to divide that row through by $c$ in order to get the matrix in your second paragraph.
    – Zhen Lin
    Commented Mar 21, 2013 at 18:08

4 Answers


They are not equivalent operations: if I add $c$ times the $2^\text{nd}$ row of $$\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}$$ to the $1^\text{st}$, I get $$\begin{bmatrix}1 & c \\ 0 & 1\end{bmatrix}.$$ On the other hand, if I just multiply the $2^\text{nd}$ row by $c$ I get $$\begin{bmatrix}1 & 0 \\ 0 & c\end{bmatrix}$$ and then adding it to the $1^\text{st}$ gives $$\begin{bmatrix}1 & c \\ 0 & c\end{bmatrix}.$$
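
Spelling out the determinants of the matrices above (an easy check, not written out explicitly in the answer): $$\det\begin{bmatrix}1 & c \\ 0 & 1\end{bmatrix} = 1 = \det\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}, \qquad \det\begin{bmatrix}1 & c \\ 0 & c\end{bmatrix} = c = c\det\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix},$$ so the single combined operation leaves the determinant unchanged, while the two-step sequence scales it by $c$, exactly as the textbook rules say.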


The difference is that in the latter case the $k^{\text{th}}$ row would also be altered.

What would give the same result would be to multiply the $k^{\text{th}}$ row by $c$, then add it to the $j^{\text{th}}$ row, and then divide the $k^{\text{th}}$ row by $c$ again.

But that just amounts to $\det A=\frac{1}{c}\cdot c\cdot\det A$, so there is no contradiction here.
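
Tracking the determinant through those three steps (my own bookkeeping of the sequence just described): $$\det A \xrightarrow{\ R_k \rightarrow cR_k\ } c\det A \xrightarrow{\ R_j \rightarrow R_j + R_k\ } c\det A \xrightarrow{\ R_k \rightarrow \frac{1}{c}R_k\ } \det A.$$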


This would only be a contradiction if you could manage (through adding linear combinations of rows to other rows, as above) to scale one of the rows without altering the others.

The only cases I am aware of where you can do that (e.g. when one row repeats another up to a constant factor) involve a non-invertible matrix, hence a determinant of $0$, so the scaling factor would not matter.
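
A small illustration of that degenerate case (my own example, not from the answer): if one row is a multiple of another, an add-operation really can rescale a row without touching the others, $$\begin{bmatrix} a & b \\ \lambda a & \lambda b \end{bmatrix} \xrightarrow{\ R_2 \rightarrow R_2 + cR_1\ } \begin{bmatrix} a & b \\ (\lambda + c)a & (\lambda + c)b \end{bmatrix},$$ which multiplies the second row by $\frac{\lambda + c}{\lambda}$ when $\lambda \ne 0$; but both matrices have determinant $0$, so no contradiction with the scaling rule arises.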


The point is that the determinant of matrix $A$ is not altered if you add $c$ times row $k$ to row $j$ for $j$ different from $k$.

The reason is that if $j \ne k$, this operation is accomplished by multiplying $A$ on the left by $I + c\,e_{jk}$, where $e_{jk}$ denotes the matrix with a $1$ in position $(j,k)$ and $0$s elsewhere, so $B = (I + c\,e_{jk}) A$. Now $I + c\,e_{jk}$ has determinant $1$ if $j \ne k$ (it is triangular with $1$s on the diagonal), but determinant $c+1$ if $j = k$.
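
Concretely, in the $2 \times 2$ case with $k = 1$ (my own illustration of these elementary matrices): $$I + c\,e_{21} = \begin{bmatrix} 1 & 0 \\ c & 1 \end{bmatrix} \ (\det = 1), \qquad I + c\,e_{11} = \begin{bmatrix} 1 + c & 0 \\ 0 & 1 \end{bmatrix} \ (\det = 1 + c),$$ and since the determinant is multiplicative, $\det B = \det(I + c\,e_{jk})\det A$.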
