Einstein summation convention: when the same index appears twice in a term, once as a subscript and once as a superscript, it is traversed and summed over. Such a repeated index is referred to as a “dummy index,” and the summation symbol may be omitted.
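As a quick illustration (not part of the original notes), NumPy's einsum implements exactly this convention: a repeated index in the subscript string is summed over automatically.

```python
# A minimal sketch of the summation convention using NumPy (illustrative only).
import numpy as np

a = np.array([1.0, 2.0, 3.0])   # components a_i
b = np.array([4.0, 5.0, 6.0])   # components b^i

# a_i b^i: the repeated dummy index i is summed without an explicit sum sign
s = np.einsum("i,i->", a, b)
print(s, a @ b)                 # both print 32.0
```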
1. Vectors and Tensors in a Finite-Dimensional Space
1.1 Real Vector Space
Based on the real number set $\mathbb{R}$,
a vector space $\mathbb{V}$ is a set of vectors satisfying:
addition is commutative: $\boldsymbol{u} + \boldsymbol{v} = \boldsymbol{v} + \boldsymbol{u}$
addition is associative: $(\boldsymbol{u} + \boldsymbol{v}) + \boldsymbol{w} = \boldsymbol{u} + (\boldsymbol{v} + \boldsymbol{w})$
addition unit: there is a zero vector $\boldsymbol{0}$ with $\boldsymbol{u} + \boldsymbol{0} = \boldsymbol{u}$
inverse: for every $\boldsymbol{u}$ there is $-\boldsymbol{u}$ with $\boldsymbol{u} + (-\boldsymbol{u}) = \boldsymbol{0}$
multiplication associative: $\alpha(\beta\boldsymbol{u}) = (\alpha\beta)\boldsymbol{u}$
multiplication unit: $1\,\boldsymbol{u} = \boldsymbol{u}$
multiplication distributive: $\alpha(\boldsymbol{u} + \boldsymbol{v}) = \alpha\boldsymbol{u} + \alpha\boldsymbol{v}$ and $(\alpha + \beta)\boldsymbol{u} = \alpha\boldsymbol{u} + \beta\boldsymbol{u}$
All vectors in a vector space can be represented by a set of linearly independent vectors in the space. Such a set of vectors is called a basis of the space, and the number of vectors in the basis is called the dimension of the space.
The scalar product $\boldsymbol{u} \cdot \boldsymbol{v}$ of two vectors is an operation satisfying:
commutative: $\boldsymbol{u} \cdot \boldsymbol{v} = \boldsymbol{v} \cdot \boldsymbol{u}$
distributive: $\boldsymbol{u} \cdot (\boldsymbol{v} + \boldsymbol{w}) = \boldsymbol{u} \cdot \boldsymbol{v} + \boldsymbol{u} \cdot \boldsymbol{w}$
scalar multiplication associative: $\alpha(\boldsymbol{u} \cdot \boldsymbol{v}) = (\alpha\boldsymbol{u}) \cdot \boldsymbol{v} = \boldsymbol{u} \cdot (\alpha\boldsymbol{v})$
positive definite: $\boldsymbol{u} \cdot \boldsymbol{u} \ge 0$, with equality only for $\boldsymbol{u} = \boldsymbol{0}$
The magnitude of a vector is defined by $\|\boldsymbol{u}\| = \sqrt{\boldsymbol{u} \cdot \boldsymbol{u}}$.
In Cartesian coordinates, the scalar product can be calculated by $\boldsymbol{u} \cdot \boldsymbol{v} = \sum_i u_i v_i$.
When $\boldsymbol{u} \cdot \boldsymbol{v} = 0$, the two vectors are said to be orthogonal.
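For concreteness, here is a small numerical check of these definitions in Cartesian components (the vectors are arbitrary examples, not taken from the text).

```python
# Cartesian scalar product, magnitude, and an orthogonality test (illustration).
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, -1.0, 0.0])

print(np.sum(u * v))                 # u . v = 0, so u and v are orthogonal
print(np.sqrt(np.sum(u * u)))        # magnitude |u| = 3.0
print(np.isclose(np.dot(u, v), np.dot(v, u)))  # commutativity of the product
```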
1.2 Dual Bases
Let $\mathcal{G} = \{\boldsymbol{g}_1, \ldots, \boldsymbol{g}_n\}$ be a basis of the $n$-dimensional Euclidean space $\mathbb{E}^n$.
A set of vectors $\mathcal{G}' = \{\boldsymbol{g}^1, \ldots, \boldsymbol{g}^n\}$ is called the dual basis of $\mathcal{G}$ if $\boldsymbol{g}^i \cdot \boldsymbol{g}_j = \delta^i_j$ (Kronecker delta).
In Euclidean space, such a basis always exists.
Let $\mathcal{E} = \{\boldsymbol{e}_1, \ldots, \boldsymbol{e}_n\}$ denote an orthonormal basis of the Euclidean space $\mathbb{E}^n$, i.e. $\boldsymbol{e}_i \cdot \boldsymbol{e}_j = \delta_{ij}$.
Under the Einstein summation convention, the two bases can be expanded in each other: $\boldsymbol{e}_i = \alpha^j_i\boldsymbol{g}_j$ and $\boldsymbol{g}_i = \beta^j_i\boldsymbol{e}_j$.
Inserting the first relation into the second one, $\boldsymbol{g}_i = \beta^j_i\alpha^k_j\boldsymbol{g}_k$.
$\mathcal{G}$ is a basis, meaning its vectors are linearly independent,
hence $\beta^j_i\alpha^k_j = \delta^k_i$.
Further let $\boldsymbol{g}^i = \alpha^i_j\boldsymbol{e}_j$,
then $\boldsymbol{g}^i \cdot \boldsymbol{g}_j = \alpha^i_k\beta^l_j\,\boldsymbol{e}_k \cdot \boldsymbol{e}_l = \alpha^i_k\beta^k_j = \delta^i_j$.
Hence, by constructing $\boldsymbol{g}^i = \alpha^i_j\boldsymbol{e}_j$,
any basis has its dual basis.
To prove linear independence of the new basis,
suppose on the contrary that $a_i\boldsymbol{g}^i = \boldsymbol{0}$ with the $a_i$ not all zero.
Multiplying both sides by $\boldsymbol{g}_j$ gives $a_i\boldsymbol{g}^i \cdot \boldsymbol{g}_j = a_i\delta^i_j = a_j = 0$.
So no nontrivial combination of these vectors gives $\boldsymbol{0}$; hence they are linearly independent.
Next comes the uniqueness.
Suppose there are two dual bases $\{\boldsymbol{g}^i\}$ and $\{\bar{\boldsymbol{g}}^i\}$, both satisfying $\boldsymbol{g}^i \cdot \boldsymbol{g}_j = \bar{\boldsymbol{g}}^i \cdot \boldsymbol{g}_j = \delta^i_j$.
Since $\{\boldsymbol{g}^j\}$ is a basis, we also have $\bar{\boldsymbol{g}}^i = c^i_j\boldsymbol{g}^j$.
Plugging this in, $\delta^i_k = \bar{\boldsymbol{g}}^i \cdot \boldsymbol{g}_k = c^i_j\boldsymbol{g}^j \cdot \boldsymbol{g}_k = c^i_k$.
This gives $c^i_j = \delta^i_j$.
Plugging back, $\bar{\boldsymbol{g}}^i = \boldsymbol{g}^i$, so the dual basis is unique.
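A small numerical sketch of this result (the basis below is an arbitrary example): writing the basis vectors as columns of a matrix in a Cartesian frame, the dual basis is obtained from the inverse of that matrix and satisfies the defining property.

```python
# Dual basis of a (non-orthogonal) basis in E^3, written in a Cartesian frame.
import numpy as np

G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])            # columns are g_1, g_2, g_3

G_dual = np.linalg.inv(G).T                # columns are g^1, g^2, g^3

# defining property g^i . g_j = delta^i_j
print(np.allclose(G_dual.T @ G, np.eye(3)))   # True
```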
With the aid of the dual basis,
one can represent an arbitrary vector in $\mathbb{E}^n$ as $\boldsymbol{x} = x^i\boldsymbol{g}_i = x_i\boldsymbol{g}^i$,
where $x^i = \boldsymbol{x} \cdot \boldsymbol{g}^i$ and $x_i = \boldsymbol{x} \cdot \boldsymbol{g}_i$.
The two types of components are called contravariant components and covariant components,
respectively.
With this notation,
the scalar product also changes form: $\boldsymbol{x} \cdot \boldsymbol{y} = x^i y_i = x_i y^i$.
The magnitude becomes $\|\boldsymbol{x}\| = \sqrt{x^i x_i}$.
Under this definition, we must specify which basis is the “original” one and which is the “dual” one.
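Continuing the same example basis (an assumption for illustration), the contravariant and covariant components can be computed by projection onto the dual and original basis vectors, and the scalar product reduces to a sum over one upper and one lower index.

```python
# Contravariant/covariant components and the scalar product x . y = x^i y_i.
import numpy as np

G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])        # columns: g_i
G_dual = np.linalg.inv(G).T            # columns: g^i

x = np.array([1.0, 2.0, 3.0])
y = np.array([-1.0, 0.5, 2.0])

x_contra = G_dual.T @ x                # x^i = x . g^i
x_cov    = G.T @ x                     # x_i = x . g_i
y_cov    = G.T @ y

print(np.allclose(G @ x_contra, x))          # x = x^i g_i
print(np.allclose(x @ y, x_contra @ y_cov))  # x . y = x^i y_i
```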
1.3 Second-Order Tensor as Linear Mapping
A linear mapping $\boldsymbol{A}$ in Euclidean space, $\boldsymbol{y} = \boldsymbol{A}\boldsymbol{x}$ with $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{E}^n$,
is called a second-order tensor.
The transformation represented by a tensor is linear,
satisfying all linearity properties: $\boldsymbol{A}(\alpha\boldsymbol{x} + \beta\boldsymbol{y}) = \alpha\boldsymbol{A}\boldsymbol{x} + \beta\boldsymbol{A}\boldsymbol{y}$.
All such tensors (linear maps) can be collected into a space,
denoted as $\mathbf{Lin}^n$.
This space has a unit element $\boldsymbol{I}$,
which makes no change to the input vector: $\boldsymbol{I}\boldsymbol{x} = \boldsymbol{x}$.
This tensor is called the identity tensor.
For example,
the vector product $\boldsymbol{w} \times \boldsymbol{x}$ with a fixed vector $\boldsymbol{w}$ is a linear mapping of $\boldsymbol{x}$.
The tensor $\boldsymbol{W}$ defined by $\boldsymbol{W}\boldsymbol{x} = \boldsymbol{w} \times \boldsymbol{x}$
can therefore be considered as a tensor.
The vector product can then also be represented by this tensor: $\boldsymbol{w} \times \boldsymbol{x} = \boldsymbol{W}\boldsymbol{x}$.
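As a numerical sketch (assuming 3D Cartesian components), the mapping of x to w × x is indeed represented by a matrix, and that matrix is skew-symmetric.

```python
# The vector product with a fixed w as a second-order tensor W (W x = w x x).
import numpy as np

w = np.array([1.0, 2.0, 3.0])
W = np.array([[ 0.0,  -w[2],  w[1]],
              [ w[2],  0.0,  -w[0]],
              [-w[1],  w[0],  0.0]])   # skew-symmetric "hat" matrix of w

x = np.array([0.5, -1.0, 2.0])
print(np.allclose(W @ x, np.cross(w, x)))   # True: W reproduces w x x
print(np.allclose(W, -W.T))                 # W is skew-symmetric
```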
1.4 Tensor Product
The tensor product enables us to construct a second-order tensor from two vectors.
Consider two vectors $\boldsymbol{a}, \boldsymbol{b} \in \mathbb{E}^n$;
an arbitrary vector $\boldsymbol{x}$ can be mapped into another vector $\boldsymbol{a}(\boldsymbol{b} \cdot \boldsymbol{x})$.
This defines a linear mapping.
We denote it by $\boldsymbol{a} \otimes \boldsymbol{b}$, with $(\boldsymbol{a} \otimes \boldsymbol{b})\boldsymbol{x} = \boldsymbol{a}(\boldsymbol{b} \cdot \boldsymbol{x})$,
and call it the tensor product.
In general, to simplify the notation, the symbol $\otimes$ is often omitted and the dyad is written $\boldsymbol{a}\boldsymbol{b}$.
The tensor product is linear in both factors: $(\alpha\boldsymbol{a} + \beta\boldsymbol{b}) \otimes \boldsymbol{c} = \alpha\,\boldsymbol{a} \otimes \boldsymbol{c} + \beta\,\boldsymbol{b} \otimes \boldsymbol{c}$ and $\boldsymbol{a} \otimes (\beta\boldsymbol{b} + \gamma\boldsymbol{c}) = \beta\,\boldsymbol{a} \otimes \boldsymbol{b} + \gamma\,\boldsymbol{a} \otimes \boldsymbol{c}$.
For the left mapping, $\boldsymbol{y}(\boldsymbol{a} \otimes \boldsymbol{b}) = (\boldsymbol{y} \cdot \boldsymbol{a})\boldsymbol{b}$.
These dyads can in fact generate the entire space of second-order tensors.
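Before the theorem, a brief numerical illustration (vectors chosen arbitrarily): in Cartesian components the dyad a ⊗ b is the outer product, and it acts on vectors from the right and from the left as stated above.

```python
# The dyad a (x) b as an outer product, with its right and left mappings.
import numpy as np

a = np.array([1.0, 0.0, 2.0])
b = np.array([3.0, 1.0, -1.0])
x = np.array([0.5, 2.0, 1.0])

D = np.outer(a, b)                       # components of a (x) b
print(np.allclose(D @ x, a * (b @ x)))   # (a (x) b) x = a (b . x)
print(np.allclose(x @ D, (x @ a) * b))   # x (a (x) b) = (x . a) b
```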
We now introduce a theorem.
Theorem: Let $\mathcal{F} = \{\boldsymbol{f}_1, \ldots, \boldsymbol{f}_n\}$ and $\mathcal{G} = \{\boldsymbol{g}_1, \ldots, \boldsymbol{g}_n\}$ be two arbitrary bases of $\mathbb{E}^n$.
Then the tensors $\boldsymbol{f}_i \otimes \boldsymbol{g}_j$ ($i, j = 1, \ldots, n$) represent a basis of $\mathbf{Lin}^n$. The dimension of $\mathbf{Lin}^n$ is thus $n^2$.
Pf.
Consider two tensors $\boldsymbol{A} \in \mathbf{Lin}^n$ and $\boldsymbol{A}' = A^{ij}\boldsymbol{f}_i \otimes \boldsymbol{g}_j$,
and let $A^{ij} = \boldsymbol{f}^i \cdot \boldsymbol{A}\boldsymbol{g}^j$.
The superscripted vectors are from the dual bases.
The two tensors are equal precisely when $\boldsymbol{A}\boldsymbol{x} = \boldsymbol{A}'\boldsymbol{x}$ for every $\boldsymbol{x} \in \mathbb{E}^n$.
Expanding $\boldsymbol{A}'\boldsymbol{x}$ with the basis, $\boldsymbol{A}'\boldsymbol{x} = A^{ij}(\boldsymbol{g}_j \cdot \boldsymbol{x})\boldsymbol{f}_i = (\boldsymbol{f}^i \cdot \boldsymbol{A}\boldsymbol{g}^j)(\boldsymbol{g}_j \cdot \boldsymbol{x})\boldsymbol{f}_i$.
Meanwhile,
recall the expansion of a vector with respect to dual bases: $\boldsymbol{x} = (\boldsymbol{x} \cdot \boldsymbol{g}_j)\boldsymbol{g}^j$ and $\boldsymbol{y} = (\boldsymbol{y} \cdot \boldsymbol{f}^i)\boldsymbol{f}_i$.
Expanding $\boldsymbol{A}\boldsymbol{x}$ with this method, $\boldsymbol{A}\boldsymbol{x} = (\boldsymbol{x} \cdot \boldsymbol{g}_j)\boldsymbol{A}\boldsymbol{g}^j = (\boldsymbol{g}_j \cdot \boldsymbol{x})(\boldsymbol{f}^i \cdot \boldsymbol{A}\boldsymbol{g}^j)\boldsymbol{f}_i$.
Then $\boldsymbol{A}\boldsymbol{x}$ and $\boldsymbol{A}'\boldsymbol{x}$ coincide for every $\boldsymbol{x}$, so $\boldsymbol{A} = \boldsymbol{A}'$.
Hence every tensor $\boldsymbol{A}$ can be expanded as a linear combination of the $\boldsymbol{f}_i \otimes \boldsymbol{g}_j$.
The $\boldsymbol{f}_i \otimes \boldsymbol{g}_j$ must also be linearly independent.
Otherwise,
there would be a set of not-all-zero coefficients such that $c^{ij}\boldsymbol{f}_i \otimes \boldsymbol{g}_j = \boldsymbol{0}$.
Let $c^{kl}$ be one of the nonzero coefficients.
Applying both sides to $\boldsymbol{g}^l$ as a right mapping gives $c^{ij}(\boldsymbol{g}_j \cdot \boldsymbol{g}^l)\boldsymbol{f}_i = c^{il}\boldsymbol{f}_i = \boldsymbol{0}$ with $c^{kl} \neq 0$.
This contradicts the fact that $\mathcal{F}$ forms a basis and is therefore linearly independent.
A second-order tensor $\boldsymbol{A}$ can thus be represented with respect to a tensor basis, for instance $\boldsymbol{A} = A^{ij}\boldsymbol{g}_i \otimes \boldsymbol{g}_j = A_{ij}\boldsymbol{g}^i \otimes \boldsymbol{g}^j = A^i_{\;j}\boldsymbol{g}_i \otimes \boldsymbol{g}^j$,
where $A^{ij} = \boldsymbol{g}^i \cdot \boldsymbol{A}\boldsymbol{g}^j$, $A_{ij} = \boldsymbol{g}_i \cdot \boldsymbol{A}\boldsymbol{g}_j$ and $A^i_{\;j} = \boldsymbol{g}^i \cdot \boldsymbol{A}\boldsymbol{g}_j$.
An expansion of an operator into outer products of basis states plays the same role in quantum mechanics, where the expansion in the eigenbasis is known as the spectral decomposition.
To provide some intuition,
we can say that the tensor space is the tensor product of a vector space and its dual space.
For the identity tensor $\boldsymbol{I}$,
take one form for example: $\boldsymbol{I} = \delta^i_j\boldsymbol{g}_i \otimes \boldsymbol{g}^j = \boldsymbol{g}_i \otimes \boldsymbol{g}^i$.
In Euclidean space, equivalently $\boldsymbol{I} = \boldsymbol{g}^i \otimes \boldsymbol{g}_i = g^{ij}\boldsymbol{g}_i \otimes \boldsymbol{g}_j = g_{ij}\boldsymbol{g}^i \otimes \boldsymbol{g}^j$, with the metric coefficients $g^{ij}$ and $g_{ij}$ introduced in Section 1.9.
1.5 Basis Transformation
We can represent a vector by its components with respect to a basis $\mathcal{G}$: $\boldsymbol{x} = x^i\boldsymbol{g}_i$.
This is similar for a tensor: $\boldsymbol{A} = A^{ij}\boldsymbol{g}_i \otimes \boldsymbol{g}_j$.
With another basis $\bar{\mathcal{G}} = \{\bar{\boldsymbol{g}}_1, \ldots, \bar{\boldsymbol{g}}_n\}$,
vectors and tensors can be represented with the new basis: $\boldsymbol{x} = \bar{x}^i\bar{\boldsymbol{g}}_i$ and $\boldsymbol{A} = \bar{A}^{ij}\bar{\boldsymbol{g}}_i \otimes \bar{\boldsymbol{g}}_j$.
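A short numerical sketch (the bases and the tensor are arbitrary examples): the same tensor has different component matrices with respect to different tensor bases, yet both component sets reassemble the same tensor.

```python
# One tensor, two bases: contravariant components A^ij = g^i . A g^j.
import numpy as np

G  = np.array([[1.0, 0.0, 0.0],
               [1.0, 1.0, 0.0],
               [0.0, 0.0, 2.0]])         # columns: g_i
Gb = np.array([[2.0, 1.0, 0.0],
               [0.0, 1.0, 1.0],
               [0.0, 0.0, 1.0]])         # columns: gbar_i

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 2.0]])          # the tensor in Cartesian components

def contra_components(A, B):
    """A^ij with respect to the basis given by the columns of B."""
    B_dual = np.linalg.inv(B).T
    return B_dual.T @ A @ B_dual

A_old = contra_components(A, G)
A_new = contra_components(A, Gb)

# both component sets rebuild the same tensor: A = A^ij g_i (x) g_j
print(np.allclose(np.einsum("ij,ai,bj->ab", A_old, G, G), A))
print(np.allclose(np.einsum("ij,ai,bj->ab", A_new, Gb, Gb), A))
```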
1.6 Tensor Operations
Composition: $(\boldsymbol{A}\boldsymbol{B})\boldsymbol{x} = \boldsymbol{A}(\boldsymbol{B}\boldsymbol{x})$.
It satisfies $\boldsymbol{A}(\boldsymbol{B} + \boldsymbol{C}) = \boldsymbol{A}\boldsymbol{B} + \boldsymbol{A}\boldsymbol{C}$ and $(\boldsymbol{B} + \boldsymbol{C})\boldsymbol{A} = \boldsymbol{B}\boldsymbol{A} + \boldsymbol{C}\boldsymbol{A}$.
For scalar multiplication, $\alpha(\boldsymbol{A}\boldsymbol{B}) = (\alpha\boldsymbol{A})\boldsymbol{B} = \boldsymbol{A}(\alpha\boldsymbol{B})$.
Composition is associative and distributive, but not commutative, so that in general $\boldsymbol{A}\boldsymbol{B} \neq \boldsymbol{B}\boldsymbol{A}$.
For the zero and identity tensors, $\boldsymbol{0}\boldsymbol{A} = \boldsymbol{A}\boldsymbol{0} = \boldsymbol{0}$ and $\boldsymbol{I}\boldsymbol{A} = \boldsymbol{A}\boldsymbol{I} = \boldsymbol{A}$.
In component form, for example, $\boldsymbol{A}\boldsymbol{B} = A^i_{\;k}B^k_{\;j}\,\boldsymbol{g}_i \otimes \boldsymbol{g}^j$.
Other combinations of contravariant and covariant components can be used as well.
Power: $\boldsymbol{A}^m = \boldsymbol{A}\boldsymbol{A}\cdots\boldsymbol{A}$ ($m$ factors), with $\boldsymbol{A}^0 = \boldsymbol{I}$.
Tensor functions can be defined with powers,
similar to the Taylor expansion.
For example, the tensor exponential $\exp(\boldsymbol{A}) = \sum_{k=0}^{\infty}\frac{\boldsymbol{A}^k}{k!}$.
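A minimal sketch of this definition, truncating the series at a finite number of terms and comparing against scipy.linalg.expm:

```python
# Tensor exponential as a truncated power series.
import numpy as np
from scipy.linalg import expm

def exp_series(A, terms=30):
    """Approximate exp(A) = sum_k A^k / k! by a finite sum."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k            # builds A^k / k! incrementally
        result = result + term
    return result

A = 0.5 * np.array([[0.0, -1.0],
                    [1.0,  0.0]])      # a small example tensor
print(np.allclose(exp_series(A), expm(A)))   # True
```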
Transposition:
Transposition swaps the two vector arguments of a tensor,
which can be regarded as letting the tensor operate on the other (dual) slot: $\boldsymbol{x} \cdot \boldsymbol{A}^{\mathrm{T}}\boldsymbol{y} = \boldsymbol{y} \cdot \boldsymbol{A}\boldsymbol{x}$.
If $\boldsymbol{A} = A^{ij}\boldsymbol{g}_i \otimes \boldsymbol{g}_j$,
then $\boldsymbol{A}^{\mathrm{T}} = A^{ji}\boldsymbol{g}_i \otimes \boldsymbol{g}_j = A^{ij}\boldsymbol{g}_j \otimes \boldsymbol{g}_i$.
For the tensor product, $(\boldsymbol{a} \otimes \boldsymbol{b})^{\mathrm{T}} = \boldsymbol{b} \otimes \boldsymbol{a}$.
Inversion:
Let $\boldsymbol{y} = \boldsymbol{A}\boldsymbol{x}$,
and expand it with components: $\boldsymbol{y} = \boldsymbol{A}(x^i\boldsymbol{g}_i) = x^i\boldsymbol{A}\boldsymbol{g}_i$.
Suppose the inverse of $\boldsymbol{A}$ is $\boldsymbol{A}^{-1}$, defined by $\boldsymbol{x} = \boldsymbol{A}^{-1}\boldsymbol{y}$;
then $\boldsymbol{A}^{-1}\boldsymbol{A}\boldsymbol{x} = \boldsymbol{x}$ for every $\boldsymbol{x}$.
Further,
$\boldsymbol{I}\boldsymbol{x} = \boldsymbol{x}$;
equating the two expressions, we get $(\boldsymbol{A}^{-1}\boldsymbol{A} - \boldsymbol{I})\boldsymbol{x} = \boldsymbol{0}$ for arbitrary $\boldsymbol{x}$. Writing $\boldsymbol{A}^{-1}\boldsymbol{A} - \boldsymbol{I} = B^i_{\;j}\,\boldsymbol{g}_i \otimes \boldsymbol{g}^j$ and choosing $\boldsymbol{x} = \boldsymbol{g}_k$ gives $B^i_{\;k}\boldsymbol{g}_i = \boldsymbol{0}$.
Since $\{\boldsymbol{g}_i\}$ forms a basis,
the coefficients must be zero: $B^i_{\;k} = 0$.
Hence $\boldsymbol{A}^{-1}\boldsymbol{A} - \boldsymbol{I} = \boldsymbol{0}$.
This indicates $\boldsymbol{A}^{-1}\boldsymbol{A} = \boldsymbol{I}$; in the same way $\boldsymbol{A}\boldsymbol{A}^{-1} = \boldsymbol{I}$.
Furthermore,
if a tensor $\boldsymbol{Q}$ satisfies $\boldsymbol{Q}^{\mathrm{T}} = \boldsymbol{Q}^{-1}$, i.e. $\boldsymbol{Q}\boldsymbol{Q}^{\mathrm{T}} = \boldsymbol{Q}^{\mathrm{T}}\boldsymbol{Q} = \boldsymbol{I}$,
then this tensor is called an orthogonal tensor.
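A quick numerical check, using a rotation about the z-axis as an example orthogonal tensor:

```python
# An orthogonal tensor Q: Q Q^T = I, and scalar products are preserved.
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.0, 2.0])

print(np.allclose(Q @ Q.T, np.eye(3)))        # Q^T is the inverse of Q
print(np.isclose((Q @ u) @ (Q @ v), u @ v))   # (Q u) . (Q v) = u . v
```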
1.7 Scalar Product of Tensors
We can define a scalar product of tensors, denoted $\boldsymbol{A} : \boldsymbol{B}$, by setting $(\boldsymbol{a} \otimes \boldsymbol{b}) : (\boldsymbol{c} \otimes \boldsymbol{d}) = (\boldsymbol{a} \cdot \boldsymbol{c})(\boldsymbol{b} \cdot \boldsymbol{d})$ and extending it linearly.
In component form, $\boldsymbol{A} : \boldsymbol{B} = A^{ij}B_{ij} = A_{ij}B^{ij} = A^i_{\;j}B_i^{\;j}$.
The scalar product of tensors is also linear in each argument.
Consider the scalar product of a tensor with itself: $\boldsymbol{A} : \boldsymbol{A} \ge 0$.
By positive definiteness we can define the norm of a tensor, $\|\boldsymbol{A}\| = \sqrt{\boldsymbol{A} : \boldsymbol{A}}$.
If the scalar product and composition are mixed: $(\boldsymbol{A}\boldsymbol{B}) : \boldsymbol{C} = \boldsymbol{B} : (\boldsymbol{A}^{\mathrm{T}}\boldsymbol{C}) = \boldsymbol{A} : (\boldsymbol{C}\boldsymbol{B}^{\mathrm{T}})$.
We can write it with indices: $(\boldsymbol{A}\boldsymbol{B}) : \boldsymbol{C} = A^i_{\;k}B^k_{\;j}C_i^{\;j}$, and the other forms follow by regrouping the dummy indices.
Similar relations hold for the other index placements and for transpositions.
Obviously we also have $\boldsymbol{A} : \boldsymbol{B} = \boldsymbol{B} : \boldsymbol{A}$
and $\boldsymbol{A} : \boldsymbol{B} = \boldsymbol{A}^{\mathrm{T}} : \boldsymbol{B}^{\mathrm{T}}$.
This indicates that the scalar product is unaffected by exchanging the two tensors or by transposing both of them.
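These identities are easy to verify numerically in Cartesian components, where A : B reduces to the sum of the elementwise products (random tensors used purely for illustration):

```python
# Tensor scalar product A : B, the induced norm, and one mixed identity.
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.normal(size=(3, 3, 3))     # three random 3x3 tensors

def ddot(X, Y):
    return np.sum(X * Y)                 # A : B = A_ij B_ij in Cartesian form

print(np.isclose(ddot(A, B), ddot(B, A)))                  # A : B = B : A
print(np.isclose(np.sqrt(ddot(A, A)), np.linalg.norm(A)))  # ||A|| = sqrt(A : A)
print(np.isclose(ddot(A @ B, C), ddot(B, A.T @ C)))        # (AB) : C = B : (A^T C)
```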
1.8 Decomposition of Second-Order Tensors
Any second-order tensor can be decomposed into the sum of a symmetric part and a skew (antisymmetric) part: $\boldsymbol{A} = \operatorname{sym}\boldsymbol{A} + \operatorname{skew}\boldsymbol{A}$, with $\operatorname{sym}\boldsymbol{A} = \tfrac{1}{2}(\boldsymbol{A} + \boldsymbol{A}^{\mathrm{T}})$ and $\operatorname{skew}\boldsymbol{A} = \tfrac{1}{2}(\boldsymbol{A} - \boldsymbol{A}^{\mathrm{T}})$.
The symmetric tensors $\mathbf{Sym}^n$ and the skew tensors $\mathbf{Skew}^n$ are in fact subspaces of $\mathbf{Lin}^n$.
The two subspaces have only one common element, the zero tensor $\boldsymbol{0}$.
With components,
symmetric tensors are characterized by $A^{ij} = A^{ji}$ (equivalently $A_{ij} = A_{ji}$),
because of $\boldsymbol{A} = \boldsymbol{A}^{\mathrm{T}}$.
Similarly, for skew tensors, $A^{ij} = -A^{ji}$ and $A_{ij} = -A_{ji}$, so in particular the diagonal components $A^{ii}$ (no sum) vanish.
Obviously, a symmetric tensor $\boldsymbol{S}$ and a skew tensor $\boldsymbol{W}$ are orthogonal: $\boldsymbol{S} : \boldsymbol{W} = 0$.
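A short numerical illustration of the decomposition and of the orthogonality of the two parts (random tensor used for illustration):

```python
# Symmetric/skew decomposition of a tensor and the orthogonality S : W = 0.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))

S = 0.5 * (A + A.T)                   # symmetric part
W = 0.5 * (A - A.T)                   # skew part

print(np.allclose(S + W, A))          # the two parts recover A
print(np.isclose(np.sum(S * W), 0.0)) # S : W = 0
```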
1.9 Metric Tensor
The original and dual bases are connected through the metric tensor.
The metric tensor has two types of components,
defined on the original and the dual basis, respectively: $g_{ij} = \boldsymbol{g}_i \cdot \boldsymbol{g}_j$ and $g^{ij} = \boldsymbol{g}^i \cdot \boldsymbol{g}^j$.
By the symmetry of the scalar product,
$g_{ij} = g_{ji}$ and $g^{ij} = g^{ji}$,
and the metric tensor is symmetric.
The two types of metric coefficients are invertible;
meanwhile, they are inverse to each other: $g^{ik}g_{kj} = \delta^i_j$.
We can prove it easily: expanding $\boldsymbol{g}_j = (\boldsymbol{g}_j \cdot \boldsymbol{g}_k)\boldsymbol{g}^k = g_{jk}\boldsymbol{g}^k$ gives $\delta^i_j = \boldsymbol{g}^i \cdot \boldsymbol{g}_j = g_{jk}\,\boldsymbol{g}^i \cdot \boldsymbol{g}^k = g^{ik}g_{kj}$.
The metric tensor can also be used to raise or lower indices: $x^i = g^{ij}x_j$ and $x_i = g_{ij}x^j$, and likewise $\boldsymbol{g}^i = g^{ij}\boldsymbol{g}_j$ and $\boldsymbol{g}_i = g_{ij}\boldsymbol{g}^j$.
This can be proven by plugging the dual-basis expansion into the original-basis expansion and vice versa.
Written in compact matrix form, $[x^i] = [g^{ij}][x_j]$ and $[x_i] = [g_{ij}][x^j]$.
Under this transformation, the vector itself does not change. Only the components change, because the basis has been changed. In fact, every basis transformation can be handled like this, even if the two bases are not dual.
The connection between the original and dual bases is established through the metric tensor, which exists in both covariant and contravariant forms, $g_{ij} = \boldsymbol{g}_i \cdot \boldsymbol{g}_j$ and $g^{ij} = \boldsymbol{g}^i \cdot \boldsymbol{g}^j$. These metric tensors are symmetric and mutually inverse, satisfying $g^{ik}g_{kj} = \delta^i_j$.
This inverse relationship allows the metric tensor to raise and lower indices, transforming between covariant and contravariant components as $x^i = g^{ij}x_j$ and $x_i = g_{ij}x^j$.
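Using the example basis from before (an assumption for illustration), the metric coefficients and the index raising/lowering relations can be checked directly:

```python
# Covariant and contravariant metric coefficients, and raising/lowering indices.
import numpy as np

G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])       # columns: g_i
G_dual = np.linalg.inv(G).T           # columns: g^i

g_cov    = G.T @ G                    # g_ij = g_i . g_j
g_contra = G_dual.T @ G_dual          # g^ij = g^i . g^j
print(np.allclose(g_contra @ g_cov, np.eye(3)))   # g^ik g_kj = delta^i_j

x = np.array([2.0, -1.0, 0.5])
x_contra = G_dual.T @ x               # x^i
x_cov    = G.T @ x                    # x_i
print(np.allclose(x_contra, g_contra @ x_cov))    # x^i = g^ij x_j
print(np.allclose(x_cov, g_cov @ x_contra))       # x_i = g_ij x^j
```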
When considering two arbitrary sets of bases $\{\boldsymbol{g}_i\}$ and $\{\bar{\boldsymbol{g}}_i\}$,
which are not necessarily dual,
they are related by a transformation matrix $A = [a^j_{\;i}]$ such that $\bar{\boldsymbol{g}}_i = a^j_{\;i}\boldsymbol{g}_j$.
The vector can be expressed in both bases as $\boldsymbol{x} = x^i\boldsymbol{g}_i = \bar{x}^i\bar{\boldsymbol{g}}_i$.
The contravariant components transform according to $x^j = a^j_{\;i}\bar{x}^i$, which in matrix form is $[x] = A[\bar{x}]$, i.e. $[\bar{x}] = A^{-1}[x]$.
To handle the transformation of covariant components, a mixed metric tensor is introduced, $h_{ij} = \bar{\boldsymbol{g}}_i \cdot \boldsymbol{g}_j$. This mixed metric tensor relates the covariant components in the new basis to the contravariant components in the original basis as $\bar{x}_i = h_{ij}x^j$. Using the definition of $h_{ij}$ and the transformation of the bases, it follows that $h_{ij} = a^k_{\;i}\,g_{kj}$,
where $g_{kj}$ is the covariant metric tensor of the original basis.
Thus, the covariant components transform as $\bar{x}_i = a^k_{\;i}g_{kj}x^j = a^k_{\;i}x_k$, in matrix form $[\bar{x}]_{\mathrm{cov}} = A^{\mathrm{T}}[x]_{\mathrm{cov}}$,
where $A^{\mathrm{T}}$ is the transpose of the transformation matrix $A$.
The metric tensors of the two bases are related through the transformation matrix. The covariant metric tensor of the new basis is given by $\bar{g}_{ij} = \bar{\boldsymbol{g}}_i \cdot \bar{\boldsymbol{g}}_j = a^k_{\;i}a^l_{\;j}g_{kl}$, or in matrix form $[\bar{g}] = A^{\mathrm{T}}[g]A$.
Similarly, the contravariant metric tensor transforms with the inverse matrix, $[\bar{g}]^{-1} = A^{-1}[g]^{-1}A^{-\mathrm{T}}$. The mixed metric tensor also has an inverse, built from the dual bases: $h^{ij} = \boldsymbol{g}^i \cdot \bar{\boldsymbol{g}}^j$ satisfies $h_{ik}h^{kj} = \delta_i^{\;j}$; moreover, exchanging the roles of the two bases simply transposes the mixed metric, $\boldsymbol{g}_i \cdot \bar{\boldsymbol{g}}_j = h_{ji}$.
In the special case where the new basis is the dual of the original basis, $\bar{\boldsymbol{g}}_i = \boldsymbol{g}^i$, the transformation matrix simplifies to $a^j_{\;i} = g^{ji}$, and the mixed metric tensor becomes $h_{ij} = \boldsymbol{g}^i \cdot \boldsymbol{g}_j = \delta^i_j$, reducing the transformation to the familiar index-raising form $x^i = g^{ij}x_j$.
For orthogonal transformations, where $A$ is an orthogonal matrix, the metric transforms as $[\bar{g}] = A^{\mathrm{T}}[g]A$; in particular, an orthonormal basis is mapped to another orthonormal basis ($[\bar{g}] = A^{\mathrm{T}}A = I$), preserving inner products.
This framework of using metric tensors and mixed metric tensors provides a consistent method for transforming vector components between arbitrary bases, generalizing the concept beyond dual bases and enabling applications in various coordinate systems and physical contexts.
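Finally, a numerical sketch of the general (non-dual) change of basis described above; the particular bases and the transformation matrix are arbitrary examples chosen for illustration.

```python
# Component transformation between two arbitrary (non-dual) bases.
import numpy as np

G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])          # columns: old basis g_i
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])          # transformation, gbar_i = a^j_i g_j
Gb = G @ A                               # columns: new basis gbar_i

x = np.array([1.0, -2.0, 0.5])

x_contra_old = np.linalg.solve(G, x)     # x^i
x_contra_new = np.linalg.solve(Gb, x)    # xbar^i
print(np.allclose(x_contra_old, A @ x_contra_new))   # x^j = a^j_i xbar^i

x_cov_old = G.T @ x                      # x_i
x_cov_new = Gb.T @ x                     # xbar_i
print(np.allclose(x_cov_new, A.T @ x_cov_old))       # xbar_i = a^j_i x_j
```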