Tensor product vs direct product vs Cartesian product
22 Jun 2016

One often writes $f : A \times B \rightarrow C$ meaning that the argument of the function $f$ is a tuple $(a, b)$, where $a \in A$ and $b \in B$, and function values lie in $C$. This is a typical use case of the Cartesian product. But is it the same product as in the definition of the three-dimensional vector space $\mathbb{R}^3 = \mathbb{R} \times \mathbb{R} \times \mathbb{R}$? Turns out it is not. Here $\times$ means direct product instead. Finally, you might have encountered the tensor product of vectors $a \otimes b = ab^T$, where $a \in \mathbb{R}^m$ and $b \in \mathbb{R}^n$, which is again different from the other kinds of products. So, when to use what? Let’s look at them one by one, and everything will become clear.
Cartesian product. This is the simplest of the operations we are going to consider. It takes multiple sets and returns a set. No structure on the sets is assumed. For example, if $A$ and $B$ are sets, their Cartesian product $C$ consists of all ordered pairs $(a, b)$ where $a \in A$ and $b \in B$,

$$C = A \times B = \{ (a, b) \mid a \in A,\; b \in B \}.$$
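For concreteness, here is a minimal Python sketch: `itertools.product` enumerates exactly these ordered pairs (the sets `A` and `B` below are my own illustrative choices).

```python
from itertools import product

# Two plain sets; no algebraic structure is assumed.
A = {"x", "y"}
B = {1, 2, 3}

# The Cartesian product A x B: all ordered pairs (a, b)
# with a in A and b in B.
C = set(product(A, B))

print(len(C))          # 6, i.e. |A| * |B|
print(("x", 2) in C)   # True
```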
Direct product. If sets $A$ and $B$ carry some algebraic structure (e.g. they are groups), then we can define a suitable structure on the product set as well. So, direct product is like Cartesian product, but with some additional structure. For example, if $(A, \cdot)$ and $(B, \cdot)$ are groups, their direct product $(A \times B, \ast)$ forms a group with respect to element-wise multiplication

$$(a_1, b_1) \ast (a_2, b_2) = (a_1 \cdot a_2,\; b_1 \cdot b_2).$$
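To make this concrete, here is a small sketch of the direct product of two specific groups, $(\mathbb{Z}_4, +)$ and $(\mathbb{Z}_6, +)$ (my own choice of example): the underlying set is the Cartesian product, and the group operation acts componentwise.

```python
# Direct product of the groups (Z_4, +) and (Z_6, +):
# elements are pairs, the operation acts element-wise.
def op(x, y):
    (a1, b1), (a2, b2) = x, y
    return ((a1 + a2) % 4, (b1 + b2) % 6)

identity = (0, 0)

print(op((3, 5), (2, 4)))    # (1, 3): addition mod 4 in the first slot, mod 6 in the second
print(op((3, 5), identity))  # (3, 5): the pair of identities is the identity of the product
```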
Direct product is closely related to direct sum. Namely, when the number of operands is finite, they are exactly the same thing, so if you see $A \oplus B$ or $A \times B$, know that it’s the same construction. (They differ only for infinitely many operands: the direct sum keeps just the elements with finitely many non-identity components, while the direct product has no such restriction.) The choice of the symbol is usually dictated by the kind of group operation used (addition or multiplication).
Tensor product. This is a different beast. The motivation for introducing the tensor product comes from the study of multilinear maps (see How to Conquer Tensorphobia and How to lose your fear of tensor products). The tensor product can be applied to a great variety of objects and structures, including vectors, matrices, tensors, vector spaces, algebras, topological vector spaces, and modules. The most familiar case is, perhaps, when $A = \mathbb{R}^m$ and $B = \mathbb{R}^n$. Then, $A \otimes B = \mathbb{R}^m \otimes \mathbb{R}^n \cong \mathbb{R}^{mn}$. If a basis of $A$ is $e_1, \dots, e_m$ and a basis of $B$ is $f_1, \dots, f_n$, then $A \otimes B$ has the basis $e_i \otimes f_j$ for $i = 1, \dots, m$ and $j = 1, \dots, n$. That’s how one could build a basis of rank-$1$ matrices for the space of all $m \times n$ matrices, for example.
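A short NumPy sketch of this case (the vectors below are arbitrary illustrations): `np.outer` realizes $a \otimes b = ab^T$, and flattening exhibits the isomorphism $\mathbb{R}^m \otimes \mathbb{R}^n \cong \mathbb{R}^{mn}$.

```python
import numpy as np

m, n = 2, 3
a = np.array([1.0, 2.0])        # a in R^m
b = np.array([3.0, 4.0, 5.0])   # b in R^n

# a (x) b = a b^T: an m-by-n matrix of rank 1.
T = np.outer(a, b)
print(T.shape)                    # (2, 3)
print(np.linalg.matrix_rank(T))   # 1

# Flattening identifies R^m (x) R^n with R^{mn}:
# np.kron of two 1-D arrays is the mn-vector with entries a_i * b_j.
print(np.allclose(np.kron(a, b), T.ravel()))  # True

# The basis e_i (x) f_j: outer products of standard basis vectors,
# i.e. matrices with a single 1, which span all m-by-n matrices.
e, f = np.eye(m), np.eye(n)
print(np.outer(e[0], f[1]))       # 1 in position (0, 1), zeros elsewhere
```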