
How can one prove that $e^{A \oplus B} = e^A \otimes e^B$? Here $A$ and $B$ are $n\times n$ and $m \times m$ matrices, $\otimes$ is the Kronecker product and $\oplus$ is the Kronecker sum: $$ A \oplus B = A\otimes I_m + I_n\otimes B, $$ where $I_m$ and $I_n$ are the identity matrices of size $m\times m$ and $n\times n$, respectively.

EDIT: The MathWorld page http://mathworld.wolfram.com/KroneckerSum.html states that this property holds; a proof also appears in this thesis:

http://digitalcommons.unf.edu/cgi/viewcontent.cgi?article=1025&context=etd
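
EDIT 2: For a quick numerical sanity check before attempting a proof, here is a small script (a sketch using SciPy's expm; the sizes and the random seed are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(42)
A = rng.standard_normal((3, 3))   # n = 3
B = rng.standard_normal((2, 2))   # m = 2

# Kronecker sum: A (+) B = A (x) I_m + I_n (x) B
ksum = np.kron(A, np.eye(2)) + np.kron(np.eye(3), B)
print(np.allclose(expm(ksum), np.kron(expm(A), expm(B))))  # True
```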

  • What's your definition of $\mathrm{e}^A$? (Power series are common, as are other methods.) – Commented Mar 12, 2014 at 5:32
  • Do $A$ and $B$ commute? More to the point, do you know how to prove this when $A,B$ are numbers? I don't mean "it is a law of exponents that..." The exponential is defined by a power series. Using that definition, do you know how to prove the equality? – Commented Mar 12, 2014 at 5:36
  • If $[A,B]\ne0$ then an upgrade is needed: the BCH formula. – anon, Mar 12, 2014 at 5:37
  • @AdamStaples $[A,B]=AB-BA$. – Commented Mar 12, 2014 at 6:13
  • But the question has not (yet) been fixed. You need to insert (and explain!) notably the symbol $\otimes$ (typed \otimes); anyone reading $AI+IB=A+B$ will say "yes, $AI=A$ and $IB=B$, big deal". Also note that $\oplus$ is not a commutative operation, at face value. – Commented Mar 12, 2014 at 7:23

5 Answers


What is to be proved is the following: $$ e^{A \otimes I_b + I_a \otimes B} = e^A \otimes e^B~,$$ where $A, I_a \in M_n$ and $B, I_b \in M_m$ (so $I_a = I_n$ and $I_b = I_m$ in the notation of the question).

This is true because $$ A \otimes I_b~~~~\text{and}~~~~ I_a \otimes B$$ commute, which can be shown using the so-called mixed-product property of the Kronecker product: $$ (A \otimes B)\cdot (C \otimes D) = (A\cdot C) \otimes (B\cdot D)~,$$ where $\cdot$ denotes the ordinary matrix product.
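
Explicitly, applying the mixed-product property in both orders shows that both products equal $A \otimes B$: $$ (A \otimes I_b)(I_a \otimes B) = (AI_a)\otimes(I_bB) = A \otimes B = (I_aA)\otimes(BI_b) = (I_a \otimes B)(A \otimes I_b)~.$$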

One can also show that for an arbitrary matrix function $f$ defined by a power series, $$f(A\otimes I_b) = f(A)\otimes I_b~~~~\text{and}~~~~ f(I_a \otimes B) = I_a \otimes f(B)~.$$ Together with the commutativity noted above, these two facts prove your result.
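
Putting the pieces together, the first step below uses $e^{X+Y}=e^Xe^Y$ for commuting $X,Y$, the second uses the property just stated, and the third is the mixed-product property again: $$ e^{A \oplus B} = e^{A\otimes I_b}\, e^{I_a \otimes B} = \left(e^A \otimes I_b\right)\left(I_a \otimes e^B\right) = \left(e^A I_a\right) \otimes \left(I_b e^B\right) = e^A \otimes e^B~.$$ (For $f$ given by a power series, the first property follows term by term from $(A\otimes I_b)^k = A^k \otimes I_b$.)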

  • Then we can take into account everyone else's information? I see. To prove the results below, would we just write out the full matrices and use the definition of the Kronecker product? – Commented Mar 12, 2014 at 7:37
  • The commutative property can be proved using the mixed-product property. – Nana, Mar 12, 2014 at 7:44

If $A$ and $B$ are commuting $n\times n$ matrices, then from the power series definition of the exponential we have:

$$e^A=\sum_{k=0}^{\infty}\frac{A^k}{k!}$$

Therefore:

$$e^Ae^B=\sum_{k_1=0}^{\infty}\frac{A^{k_1}}{k_1!}\sum_{k_2=0}^{\infty}\frac{B^{k_2}}{k_2!}=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{A^{k_1}}{k_1!}\frac{B^{k_2}}{k_2!}=\sum_{k_1=0}^{\infty}\sum_{k_2=0}^{\infty}\frac{1}{(k_1+k_2)!}\binom{k_1+k_2}{k_2}A^{k_1}B^{k_2}$$

Setting $k=k_1+k_2$ and grouping terms of equal total degree:
$$e^Ae^B=\sum_{k=0}^{\infty}\frac{1}{k!}\sum_{k_2=0}^{k}\binom{k}{k_2}A^{k-k_2}B^{k_2}=\sum_{k=0}^{\infty}\frac{(A+B)^{k}}{k!}=e^{A+B}$$
The last step uses the binomial theorem for matrices, which is valid only because $A$ and $B$ commute (see the comments).
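
A quick numerical illustration of where commutativity enters (a sketch; the matrices below are arbitrary choices, with $B$ taken to be a polynomial in $A$ so that $AB=BA$):

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = A @ A - 3.0 * A           # a polynomial in A, so A and B commute
print(np.allclose(expm(A) @ expm(B), expm(A + B)))   # True

C = np.array([[0.0, 0.0], [1.0, 0.0]])               # A and C do not commute
print(np.allclose(expm(A) @ expm(C), expm(A + C)))   # False
```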

  • This only works if $A$ and $B$ commute. – Commented Mar 12, 2014 at 6:46
  • Not really, unless $A$ and $B$ commute. Anyway, the question is not about products or sums, as the latest edit indicates. – Commented Mar 12, 2014 at 6:46
  • @AndresCaicedo, Mariano: both of you are right, I had misunderstood the point. – Alt, Mar 12, 2014 at 7:12

First and foremost, the result is not true as stated. It is only true if $A$ and $B$ commute, which is a very restrictive condition for matrices.

To handle the commutative case, one can first consider the formal power series case. In the ring $\Bbb Q[[X,Y]]$ of formal power series with rational coefficients in commuting indeterminates $X,Y$, one defines $\exp(X)$, $\exp(Y)$, and $\exp(X+Y)$ by the usual power series, and the identity $\exp(X)\exp(Y)=\exp(X+Y)$ is easily checked by comparing coefficients of an arbitrary monomial in $X,Y$: both series are equal to $\sum_{k,l\geq0}\binom{k+l}k\frac{X^kY^l}{(k+l)!}$.

Now if one restricts to formal power series with more than exponentially decreasing coefficients, substitution of a concrete value (for instance a matrix) for an indeterminate will give an absolutely convergent power series, whose limit assigns a well defined value to the substitution. If $M$ is your ring of matrices (which is also a topological $K$-vector space for $K=\Bbb R$ or $K=\Bbb C$), and $A,B\in M$ commute, then the substitutions $X:=A,Y:=B$ define, for the appropriate subring $R\subset\Bbb Q[[X,Y]]$, a continuous ring homomorphism $f:R\to M$, whose image lies in the commutative subring $K[A,B]$ of $M$ generated by $A,B$. This homomorphism then satisfies $f(\exp(S))=\exp(f(S))$ (by the definition of matrix exponentiation), so that applying $f$ to $\exp(X)\exp(Y)=\exp(X+Y)$ gives $\exp(A)\exp(B)=\exp(A+B)$.
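
The coefficient comparison in the first paragraph comes down to the identity $\binom{k+l}{k}\frac{1}{(k+l)!}=\frac{1}{k!\,l!}$, which a throwaway script can confirm (pure standard library; the bound 8 is an arbitrary choice):

```python
from fractions import Fraction
from math import comb, factorial

# coefficient of X^k Y^l in exp(X)exp(Y) is 1/(k! l!);
# in exp(X+Y) it is C(k+l, k) / (k+l)!
for k in range(8):
    for l in range(8):
        assert Fraction(1, factorial(k) * factorial(l)) == \
               Fraction(comb(k + l, k), factorial(k + l))
print("coefficients agree for all k, l < 8")
```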

  • Actually, the result is true as stated (in particular, $A$ and $B$ need not commute). The reasoning is that $A$ and $B$ act on different subspaces (it is a Kronecker sum, not a regular matrix sum). – Commented Apr 2, 2018 at 14:53
  • @PhysicsEnthusiast: I think this question did not read the way it does now at the time I wrote this answer, since I don't address Kronecker sums and products in any way. I'll be happy to delete the answer. – Commented Apr 3, 2018 at 11:39

A way to proceed: if $A$ and $B$ commute and are both diagonalizable, then they are simultaneously diagonalizable (otherwise one must fall back on the Jordan decomposition). For diagonal matrices the formula is easy, because it reduces to the property of the exponential for real numbers.
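
A minimal numerical sketch of this reduction (the shared eigenbasis $P$ and the eigenvalues are illustrative choices):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
P = rng.standard_normal((3, 3))            # shared eigenbasis (invertible almost surely)
Pinv = np.linalg.inv(P)
A = P @ np.diag([1.0, 2.0, -1.0]) @ Pinv   # simultaneously diagonalizable,
B = P @ np.diag([0.5, -2.0, 3.0]) @ Pinv   # hence A and B commute

assert np.allclose(A @ B, B @ A)
# e^A e^B = P e^{D_A} e^{D_B} P^{-1} = P e^{D_A + D_B} P^{-1} = e^{A+B}
assert np.allclose(expm(A) @ expm(B), expm(A + B))
```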

  • If you can do it for diagonal matrices, you can do it for all of them because diagonalizable matrices are dense. One needs to do this in detail, of course. – Commented Mar 12, 2014 at 7:38

Mumble! Gripe! Once again I seem to have answered the pre-edited version of the question! Ah well, at least I can take consolation in the fact that I do not appear to be alone!

I won't attempt to prove the title assertion, because it is false. I will, however, give a simple counterexample:

Let

$N_1 = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \tag{1}$

and

$N_2 = \begin{bmatrix} 0 & 0 \\ -1 & 0 \end{bmatrix}; \tag{2}$

then we have

$N_1^2 = N_2^2 = 0, \tag{3}$

$N_1 N_2 = -\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \tag{4}$

and

$N_2 N_1 = -\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}; \tag{5}$

note that

$N_1 N_2 \ne N_2 N_1. \tag{6}$

From (3) it follows that

$e^{N_1} = I + N_1 \tag{7}$

and

$e^{N_2} = I + N_2, \tag{8}$

so that

$e^{N_1} e^{N_2} = (I + N_1)(I + N_2) = I + N_1 + N_2 + N_1 N_2 = \begin{bmatrix} 0 & 1 \\ -1 & 1 \end{bmatrix}, \tag{9}$

as may be seen by a simple calculation using (1), (2), and (4). We also have the matrix $J$:

$J = N_1 + N_2 = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}; \tag{10}$

we see that

$J^2 = -I. \tag{11}$

Examining $e^J$, we see that

$e^{(N_1 + N_2)} = e^J = \sum_{n=0}^\infty \dfrac{J^n}{n!} = I + J + \dfrac{1}{2}J^2 + \cdots + \dfrac{1}{n!}J^n + \cdots, \tag{12}$

and by virtue of (11) we see that, term-by-term, the power series for $e^J$ corresponds precisely to that for $e^i$, $i^2 = -1$ the ordinary complex number square root of $-1$. This implies that the classic formula $e^{i\theta} = \cos \theta + i \sin \theta$ applies to (12) so that, when $\theta = 1$, we obtain

$e^J = I \cos (1 \; \text{rad}) + J \sin (1 \; \text{rad}) = \begin{bmatrix} \cos (1 \; \text{rad}) & \sin (1 \; \text{rad}) \\ -\sin (1 \; \text{rad}) & \cos (1 \; \text{rad}) \end{bmatrix} \tag{13}$

wherein $1 \; \text{rad} = 1 \; \text{radian}$. We see from these computations that

$e^{(N_1 + N_2)} = e^J \ne e^{N_1}e^{N_2}. \tag{14}$
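
For the skeptical, both sides of (14) are easy to evaluate numerically (a sketch using SciPy's expm):

```python
import numpy as np
from scipy.linalg import expm

N1 = np.array([[0.0, 1.0], [0.0, 0.0]])
N2 = np.array([[0.0, 0.0], [-1.0, 0.0]])

print(expm(N1) @ expm(N2))   # [[ 0, 1], [-1, 1]], as in (9)
print(expm(N1 + N2))         # [[cos 1, sin 1], [-sin 1, cos 1]], as in (13)
```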

In the event that $AB = BA$, however, the title assertion holds, as may be seen by the following simple argument: let $X$ be the unique matrix solution to

$\dot X = (A + B)X, X(0) = I; \tag{15}$

it is easy to see that

$X(t) = e^{(A + B)t}; \tag{16}$

now setting

$Y(t) = e^{At}e^{Bt} \tag{17}$

we see that

$\dot Y = Ae^{At}e^{Bt} + e^{At}Be^{Bt} = Ae^{At}e^{Bt} + Be^{At}e^{Bt} = (A + B)e^{At}e^{Bt} = (A + B)Y(t), \tag{18}$

since $AB = BA$ allows us to write $e^{At}B = Be^{At}$, swapping $B$ with powers $A^k$ of $A$ on a term-by-term basis. Since $X(t)$ and $Y(t)$ satisfy the same ordinary differential equation with the same initial conditions, we have $X(t) = Y(t)$ for all $t$; taking $t = 1$ now establishes the title assertion that

$e^Ae^B = e^{A+B}. \tag{19}$
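
As a numerical sanity check of this argument, one can integrate (15) and compare with (17) at $t = 1$ (a sketch; $B$ is taken to be a polynomial in $A$ so that $AB = BA$):

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.linalg import expm

A = np.array([[1.0, 2.0], [0.0, 3.0]])
B = A @ A - 2.0 * A                      # a polynomial in A, so AB = BA

def rhs(t, x):                           # the ODE (15): X' = (A + B) X
    return ((A + B) @ x.reshape(2, 2)).ravel()

sol = solve_ivp(rhs, (0.0, 1.0), np.eye(2).ravel(), rtol=1e-10, atol=1e-12)
X1 = sol.y[:, -1].reshape(2, 2)          # X(1)
print(np.allclose(X1, expm(A) @ expm(B)))   # True
```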

Hope this helps. Cheerio,

and as always,

Fiat Lux!!!

  • That's fine, keep it. Nana's post may make all of your posts useful for this question. Plus I'd like to learn more about matrix exponentiation; I've never heard of it before. – Commented Mar 12, 2014 at 7:39
