A tensor can be thought of as a generalization of a vector, but tensors are described in many ways, sometimes simply as arrays of numbers.
What is the most common and appropriate definition of a tensor?
I favor the abstract definition of a tensor space as a quotient vector space.
We define the tensor product of vector spaces $V$ and $W$ over a common base field as the quotient vector space: $$ V \otimes W := F(V\times W)/\sim$$ where $F(Z)$ is the free vector space generated by the elements of the set $Z$, and $\sim$ is the minimal equivalence relation such that
- $(v,w)+(v',w) \sim (v+v',w)$ and $(v,w)+(v,w') \sim (v,w+w')$
- $(\lambda v, w) \sim \lambda(v,w) \sim (v,\lambda w)$
This definition can be generalized to define a tensor product of an arbitrary number of vector spaces. A tensor is an element of a tensor space. We define $v\otimes w := [(v,w)]_\sim$
I like this definition because the important arithmetic properties of tensors follow from it immediately:
- $(v+v')\otimes w = v\otimes w + v'\otimes w$ and $v\otimes (w+w') = v\otimes w + v\otimes w'$
- $(\lambda v)\otimes w = \lambda(v\otimes w) = v\otimes(\lambda w)$
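As a concrete sanity check: once bases are fixed, $v\otimes w$ corresponds to the outer product of coordinate vectors, and the bilinearity relations above can be verified numerically. This is only an illustrative sketch in coordinates; the vectors chosen are arbitrary examples.

```python
import numpy as np

# In coordinates, v ⊗ w corresponds to the outer product of the
# coordinate vectors. Check the defining relations numerically.
v, v2 = np.array([1.0, 2.0]), np.array([3.0, -1.0])
w, w2 = np.array([0.5, 4.0, 2.0]), np.array([1.0, 1.0, -2.0])
lam = 3.0

# (v + v') ⊗ w = v ⊗ w + v' ⊗ w
assert np.allclose(np.outer(v + v2, w), np.outer(v, w) + np.outer(v2, w))
# v ⊗ (w + w') = v ⊗ w + v ⊗ w'
assert np.allclose(np.outer(v, w + w2), np.outer(v, w) + np.outer(v, w2))
# (λv) ⊗ w = λ(v ⊗ w) = v ⊗ (λw)
assert np.allclose(np.outer(lam * v, w), lam * np.outer(v, w))
assert np.allclose(np.outer(v, lam * w), lam * np.outer(v, w))
print("bilinearity relations hold")
```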
It also shows that the tensor product is uniquely defined and independent of the bases of the vector spaces. But given specific bases of $V$ and $W$, we can easily construct an isomorphism between the abstract tensor product and the space of $\dim V \times \dim W$ arrays of numbers. We just need to remember that this isomorphism is basis-dependent.
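The basis-dependence can be made concrete: changing the basis of $V$ changes the array of coordinates of the *same* tensor. A minimal sketch, with an arbitrarily chosen change-of-basis matrix $P$ (columns are the new basis vectors expressed in the old basis), under which coordinates transform by $P^{-1}$:

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 1.0, -1.0])

# Coordinates of v ⊗ w in the standard bases: the outer product.
T_old = np.outer(v, w)

# A (hypothetical) new basis for V, as columns of P.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v_new = np.linalg.solve(P, v)   # coordinates of v in the new basis
T_new = np.outer(v_new, w)      # coordinates of the same tensor, new basis

# Same abstract tensor, different arrays of numbers:
assert not np.allclose(T_old, T_new)
# Consistency: transforming back recovers the original array.
assert np.allclose(P @ T_new, T_old)
print("the array representation is basis-dependent")
```

So the "array of numbers" picture is a faithful but basis-dependent coordinate description of the abstract object defined above.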