
Consider adding angular momentum. Shankar describes the state of the system as the direct product of states while Ballentine (and I think most other people) describes the state of the system as the tensor product of states. I assume Shankar isn't straight up wrong. Thus, I am a bit confused.

  • same thing, different name (Commented Mar 26, 2023 at 22:03)
  • That's not what google tells me. Of course, they are hopefully the same in this context, but I don't understand how.
    – EEH
    Commented Mar 26, 2023 at 22:06
  • @EE18 I would reconsider posting my email address on a forum in plain text.
    – Samuel
    Commented Mar 27, 2023 at 7:13

2 Answers


$$\newcommand{ket}[1]{\left|#1\right>}$$ Physicists tend to be a bit loose with their terminology, for better or worse. The correct notion here is the tensor product space, not the direct/Cartesian product space (which is in this case equivalent to the direct sum).

The distinction between $\psi \otimes \phi$ and $\psi \times \phi \equiv \psi \oplus \phi$ is a distinction between the spaces to which those objects belong. Both objects are fundamentally an ordered pair of vectors, but the distinction between the tensor product $\otimes$ and direct product $\times$ / direct sum $\oplus$ has to do with how we combine and scale these ordered pairs via vector addition and scalar multiplication.


Given two vector spaces $V$ and $W$, we may construct the direct product space $V\times W$ whose underlying set consists of the ordered pairs $(v,w)$ with $v\in V$ and $w\in W$. The addition and scalar multiplication operations on $V\times W$ are defined as follows: $$\times\text{-Addition}:\qquad (v,w)+(v',w') = (v+v',w+w')$$ $$\times\text{-Multiplication:}\qquad \lambda \cdot (v,w) = (\lambda v,\lambda w)$$ In practice, we often use the notation $(v,w)\equiv v\oplus w$ or, if we're being even more lax, $(v,w)\equiv v+w$. Note that, for example, $\lambda(v \oplus w) = \lambda v \oplus \lambda w$ is reminiscent of a distributive law.
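If a numerical illustration helps, here is a minimal sketch of these rules (an assumption for illustration only: $(v,w)$ is represented as the concatenated vector $v\oplus w$, and the helper name `dsum` is made up here):

```python
# Sketch: the direct product / direct sum rules, with (v, w) stored as the
# stacked vector v ⊕ w, so dim(V ⊕ W) = dim V + dim W.
import numpy as np

v, vp = np.array([1.0, 2.0]), np.array([3.0, 4.0])             # elements of V ≅ R^2
w, wp = np.array([5.0, 6.0, 7.0]), np.array([0.0, 1.0, 2.0])   # elements of W ≅ R^3

def dsum(a, b):
    """(a, b) written as the stacked vector a ⊕ b."""
    return np.concatenate([a, b])

# ×-Addition: (v, w) + (v', w') = (v + v', w + w')
assert np.allclose(dsum(v, w) + dsum(vp, wp), dsum(v + vp, w + wp))

# ×-Multiplication: λ(v, w) = (λv, λw), i.e. λ scales *both* slots
lam = 3.0
assert np.allclose(lam * dsum(v, w), dsum(lam * v, lam * w))

print(dsum(v, w).shape)  # (5,) = 2 + 3: dimensions add, not multiply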


On the other hand, the tensor product space $V\otimes W$ is a bit more complex. The underlying set consists of all formal linear combinations of elements of $V\times W$; in other words, an element of $V\otimes W$ takes the general form $$\sum_i c_i (v_i,w_i)$$ for some collection of vectors $\{v_i\}$ and $\{w_i\}$ and scalars $\{c_i\}$. The addition and scalar multiplication rules are also different: $$\otimes\text{-Addition:}\qquad (v+v',w) = (v,w)+(v',w) \qquad (v,w+w')= (v,w) + (v,w')$$ $$\otimes\text{-Multiplication:}\qquad \lambda \cdot (v,w) = (\lambda v,w) = (v,\lambda w)$$

In practice, we typically use the notation $(v,w)\equiv v\otimes w$, or occasionally just $vw$ if the tensor product is understood to be implicit (I prefer to write $\otimes$ explicitly while I'm teaching the concept for the first time, and then drop it after it becomes exhausting).
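For contrast, a similarly minimal sketch of the $\otimes$ rules, identifying $(v,w)$ with `np.kron(v, w)` (again just an illustrative assumption, not part of the answer's argument):

```python
# Sketch: the tensor product rules, with (v, w) identified with np.kron(v, w).
import numpy as np

v, vp = np.array([1.0, 2.0]), np.array([3.0, 4.0])
w, wp = np.array([5.0, 6.0, 7.0]), np.array([0.0, 1.0, 2.0])
lam = 3.0

# ⊗-Addition: (v + v', w) = (v, w) + (v', w), and similarly in the second slot
assert np.allclose(np.kron(v + vp, w), np.kron(v, w) + np.kron(vp, w))
assert np.allclose(np.kron(v, w + wp), np.kron(v, w) + np.kron(v, wp))

# ⊗-Multiplication: λ(v, w) = (λv, w) = (v, λw), i.e. λ lands in *either* slot, once
assert np.allclose(lam * np.kron(v, w), np.kron(lam * v, w))
assert np.allclose(lam * np.kron(v, w), np.kron(v, lam * w))

print(np.kron(v, w).shape)  # (6,) = 2 * 3: dimensions multiply
```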

A crucial difference between $V\times W$ and $V\otimes W$ is that every element of $V\times W$ can be expressed as some ordered pair $(v,w)$ by trivially applying the rules of addition and scalar multiplication. This is not true of $V\otimes W$; there's no way to reduce $(v,w) + (v',w')$ down to a single ordered pair for generic $v,v',w,w'$.

Elements of $V\otimes W$ which can be expressed as a single ordered pair are sometimes called simple; the rest are non-simple, and through the lens of quantum mechanics, it is precisely the non-simple state vectors which represent entangled states.
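One concrete way to test simplicity numerically (a sketch, assuming the usual identification of an element of $V\otimes W$ with a dim$V$ × dim$W$ coefficient matrix; the helper name `schmidt_rank` is made up here): the element is simple exactly when that matrix has rank 1.

```python
# Sketch: simple vs. non-simple elements of V ⊗ W via the rank of the
# coefficient matrix (rank 1 means simple / product state).
import numpy as np

rng = np.random.default_rng(0)
v, vp = rng.normal(size=2), rng.normal(size=2)   # V ≅ R^2
w, wp = rng.normal(size=3), rng.normal(size=3)   # W ≅ R^3

def schmidt_rank(psi, dimV, dimW):
    """Rank of psi reshaped to a dimV x dimW matrix."""
    return np.linalg.matrix_rank(psi.reshape(dimV, dimW))

print(schmidt_rank(np.kron(v, w), 2, 3))                    # 1: a simple element
print(schmidt_rank(np.kron(v, w) + np.kron(vp, wp), 2, 3))  # 2: generically non-simple
```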


Consider adding angular momentum. Shankar describes the state of the system as the direct product of states while Ballentine (and I think most other people) describes the state of the system as the tensor product of states.

Consider a composite system consisting of two spin-1/2 particles which are fixed in space (so their only degree of freedom is their spin). The corresponding Hilbert space is $\mathbb C^2 \otimes \mathbb C^2$, not $\mathbb C^2\times \mathbb C^2$.

A generic element of the composite space takes the form $$\psi = a \ket{\uparrow\uparrow} + b \ket{\uparrow\downarrow} + c\ket{\downarrow\uparrow} + d\ket{\downarrow\downarrow}$$

where $\ket{\uparrow\downarrow} \equiv \pmatrix{1\\0} \otimes \pmatrix{0\\1}$. The fact that this cannot be condensed down into a single ket is an expression of the fact that we are studying the tensor product space rather than the direct product space. More explicitly, we talk about entangled (non-simple) states of the form $\ket{\uparrow\downarrow}\pm \ket{\downarrow\uparrow}$, which would not exist if we were studying the direct product space.
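If it helps to see this numerically, here is a minimal sketch of the two-spin example (assuming the basis convention above, $\ket\uparrow = (1,0)^T$, $\ket\downarrow = (0,1)^T$, with `np.kron` playing the role of $\otimes$):

```python
# Sketch: the two-spin-1/2 example in C^2 ⊗ C^2 ≅ C^4.
import numpy as np

up, dn = np.array([1.0, 0.0]), np.array([0.0, 1.0])

uu, ud, du, dd = (np.kron(up, up), np.kron(up, dn),
                  np.kron(dn, up), np.kron(dn, dn))

# A generic state psi = a|↑↑> + b|↑↓> + c|↓↑> + d|↓↓>
a, b, c, d = 1.0, 2.0, 3.0, 4.0
psi = a * uu + b * ud + c * du + d * dd
print(psi)  # [1. 2. 3. 4.]

# The (unnormalized) state |↑↓> - |↓↑> cannot be written as a single kron:
# its 2x2 coefficient matrix has rank 2, i.e. the state is entangled.
singlet = ud - du
print(np.linalg.matrix_rank(singlet.reshape(2, 2)))  # 2
```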

  • @hft Indeed, thanks for the catch.
    – J. Murray
    Commented Mar 28, 2023 at 15:29
  • Thank you! This was extremely informative and thorough. I especially appreciated that you fully explained the nuances at play (and didn't discard the question out of hand). A point of clarification: you seem to indicate that we treat the direct sum of two elements of vector spaces (so vectors) as an element of the Cartesian product of those vector spaces. I assume this treatment is recoverable from the typical treatment of the direct sum of matrices or vector spaces. If so, can you direct me towards a way to demonstrate this for myself? (If you want to explain it yourself, I am all for it as well, of course!)
    – EEH
    Commented Apr 23, 2023 at 20:04
  • @Eric Can you clarify what you mean by the "typical" treatment? If you mean that the direct sum of two matrices is conventionally written $$\mathbf A \oplus \mathbf B = \pmatrix{\mathbf A & 0 \\ 0 & \mathbf B}$$ then I would simply say that that's just a different way of writing $(\mathbf A,\mathbf B)$, where the addition and scalar multiplication of those pairs occurs element-wise as described above. Or did you mean something different?
    – J. Murray
    Commented Apr 23, 2023 at 20:57
  • That's exactly what I meant! I was thinking along the lines that you outlined. However, I wanted to make sure I was on the right track before I started playing with the formalism! Thanks again for the help!
    – EEH
    Commented Apr 23, 2023 at 21:21

There are a dozen questions on this site beating that horse to death. You might as well start here. Note that in the lede of the Kronecker product article it is sometimes called the "matrix direct product".

Physicists are practical people and treat m apples and n oranges the same way, whether they are kept as plain sets or arrayed into a pointless mn-dimensional vector on which basis changes would be meaningless.

It is not going to kill you to promote a direct product to a tensor product (or, more hidebound, see section 3 of this); so, for example, to organize m (e.g. 3) color and n (e.g. 2) isospin degrees of freedom into the same mn-vector, with an additive structure, where each orange is paired with the entire set of m apples and the pairs are stacked up into a huge vector of mn apple-orange pairs. (Most physicists, however, feel more comfortable thinking of them as arranged on a parallelogram, with the isospin on the abscissa and the color on the ordinate.) The point is that color and isospin commute, so you will always operate on them by isospin operators and color operators that resolutely ignore each other. As a result, you will never have a meaningful joint color-isospin operation (one that would simplify and shed insight on your system by a change of basis!); you rotate isospin and color separately, and never the twain shall meet.

Angular momentum addition, by contrast, involves the Kronecker product, a tensor product where you combine, e.g., two angular momenta by making a combined mn-vector whose entries are the entries of the first constituent m-vector, each multiplied by the entire n-vector of the second. The corresponding mn×mn-matrix generator acting on this vector is dubbed the "coproduct"; it is symmetric between the two factors, and it is reducible in general, so a basis change (Clebsch-Gordan) reduces it to several block components. So, here, the vector is meaningful, and it matters that it is not a mere Cartesian product.
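For concreteness, a small numerical sketch (an illustrative assumption: the familiar 1/2 ⊗ 1/2 case, ħ = 1) of the coproduct, showing the Clebsch-Gordan content 1/2 ⊗ 1/2 = 0 ⊕ 1 in the eigenvalues of the total J²:

```python
# Sketch: the coproduct J_a = j_a ⊗ I + I ⊗ j_a for two spin-1/2's.
import numpy as np

sx = np.array([[0, 1], [1, 0]]) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]]) / 2
I2 = np.eye(2)

# Each single-particle generator acts on its own factor only
J = [np.kron(s, I2) + np.kron(I2, s) for s in (sx, sy, sz)]
J2 = sum(Ji @ Ji for Ji in J)

# Eigenvalues j(j+1): 0 once (singlet block) and 2 three times (triplet block)
print(np.round(np.linalg.eigvalsh(J2), 6))  # [0. 2. 2. 2.]
```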

To improve your question, you might dream up a concrete example where the distinction could matter and drive you to a wrong conclusion... a tough challenge. At every stage of your construction keep track of the dimensionalities of the component and total vector spaces.

An outstanding reference that utilizes and implicitly "explains" what you think of as an off-mainstream misuse of the direct product here is Wu-Ki Tung's standard text.

You might illustrate my promotion of $\times$ to $\otimes$ above by considering the only practical example you are likely to come across, representation matrices acting on vectors. Defining the color Lie algebra element as $\mathfrak a$ (an m×m matrix), and the isospin Lie algebra element as $\mathfrak b$ (an n×n matrix), the generic $SU(m)\times SU(n)$ group element is $$e^{{\mathfrak a}\oplus {\mathfrak b}}= e^{{\mathfrak a}}\oplus e^{{\mathfrak b}}, $$ an (m+n)×(m+n) matrix in direct sum blocks, acting on an (m+n)-vector $v\oplus w$.
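A quick numerical sanity check of this block identity (a sketch only, with generic 2×2 matrices standing in for the Lie algebra elements $\mathfrak a$, $\mathfrak b$, and scipy's expm and block_diag doing the work):

```python
# Sketch: exp(a ⊕ b) = exp(a) ⊕ exp(b) as block-diagonal (m+n) x (m+n) matrices.
import numpy as np
from scipy.linalg import expm, block_diag

rng = np.random.default_rng(1)
a = rng.normal(size=(2, 2))   # stand-in for the color generator  (m x m)
b = rng.normal(size=(2, 2))   # stand-in for the isospin generator (n x n)

lhs = expm(block_diag(a, b))          # e^{a ⊕ b}
rhs = block_diag(expm(a), expm(b))    # e^{a} ⊕ e^{b}
assert np.allclose(lhs, rhs)
print(lhs.shape)  # (4, 4): dimensions add
```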

However, it doesn't cost you a thing to expand your vector space to mn-vectors by Kronecker-tensoring them, $v\otimes w$, now acted upon by $$ e^{{\mathfrak a}\otimes {\mathbb I} + {\mathbb I} \otimes {\mathfrak b}}= e^{{\mathfrak a}}\otimes e^{{\mathfrak b}}, $$ now an mn×mn matrix, much bigger, but noticeably cleaner. Note that the two terms in the exponent on the l.h.s. commute with each other!
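And the Kronecker counterpart, checked with the same toy 2×2 stand-ins, verifying both the identity and the claim that the two exponent terms commute:

```python
# Sketch: exp(a ⊗ I + I ⊗ b) = exp(a) ⊗ exp(b), which holds because the
# two terms in the exponent commute.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
a = rng.normal(size=(2, 2))
b = rng.normal(size=(2, 2))
Im, In = np.eye(2), np.eye(2)

X = np.kron(a, In)   # acts on the first (color) factor only
Y = np.kron(Im, b)   # acts on the second (isospin) factor only
assert np.allclose(X @ Y, Y @ X)                              # the terms commute
assert np.allclose(expm(X + Y), np.kron(expm(a), expm(b)))    # the identity holds
print(expm(X + Y).shape)  # (4, 4) = mn x mn: dimensions multiply
```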

