Vectors and Tensors: Algebra
Downloaded from https://academic.oup.com/book/43650/chapter/365066176 by Indian Inst of Tech Madras user on 12 August 2025
The theories of Solid Mechanics are mathematical in nature. Thus in this chapter and the next,
we begin with a discussion of the essential mathematical definitions and operations regarding
vectors and tensors needed to discuss the major topics covered in this book.1
The approach taken is a pragmatic one, without an overly formal emphasis upon mathemati-
cal proofs. The end-goal is to develop, in a reasonably systematic way, the essential mathematics
of vectors and tensors needed to discuss the major aspects of mechanics of solids. The physical
quantities that we will encounter in this text will be of three basic types: scalars, vectors, and
tensors. Scalars are quantities that can be described simply by numbers, for example things like
temperature and density. Vectors are quantities like velocity and displacement—quantities that
have both magnitude and direction. Tensors will be used to describe more complex quantities
like stress and strain, as well as material properties. In this chapter we deal with the algebra of
vectors and tensors. In the next chapter we deal with their calculus.
¹ The mathematical preliminaries in this chapter and the next are largely based on the clear expositions by Gurtin in Gurtin (1981) and Gurtin et al. (2010).
Continuum Mechanics of Solids. Lallit Anand and Sanjay Govindjee, Oxford University Press (2020).
© Lallit Anand and Sanjay Govindjee, 2020.
DOI: 10.1093/oso/9780198864721.001.0001
and hence has the value +1, −1, or 0 when {i, j, k} is an even permutation, an odd permuta-
tion, or not a permutation of {1, 2, 3}, respectively. Where convenient, we will use the compact
notation
{ei} ≝ {e1, e2, e3},
to denote our positively oriented orthonormal basis.
² The alternating symbol is sometimes called the permutation symbol.
Guided by this relation, we define the coordinates of a point x with respect to the origin o by
xi = (x − o) · ei . (1.1.6)
In view of (1.1.4), the inner product of vectors u and v may be expressed as
u · v = (ui ei) · (vj ej)
      = ui vj δij
      = ui vi = u1 v1 + u2 v2 + u3 v3. (1.1.7)
It should also be recalled that this relation is equivalent to u · v = |u| |v| cos θuv , where | · |
denotes the length of a vector (its magnitude or norm) and θuv is the angle between vectors
u and v. Equation (1.1.4) further implies that the cross product of two vectors may be expressed as
u × v = (uj ej) × (vk ek)
      = uj vk (ej × ek)
      = eijk uj vk ei. (1.1.8)
In particular, (1.1.8) implies that the vector u × v has the component form
(u × v)i = eijk uj vk . (1.1.9)
This definition of the cross product is fully compatible with the relation |u × v| = |u| |v| sin θuv ,
which indicates that the magnitude of the cross product of two vectors is equal to the area of
the parallelogram whose edges are defined by u and v.
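As a numerical aside (not part of the text's development), the component formula (1.1.9) and the parallelogram-area interpretation can be checked with NumPy; the array `eps` below is one way to encode the alternating symbol:

```python
import numpy as np

# Alternating symbol e_ijk stored as a 3x3x3 array
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 2.0])

# (u x v)_i = e_ijk u_j v_k, with summation over repeated indices via einsum
w = np.einsum('ijk,j,k->i', eps, u, v)
assert np.allclose(w, np.cross(u, v))

# |u x v| = |u||v| sin(theta): area of the parallelogram spanned by u and v
cos_t = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1.0 - cos_t**2)
assert np.isclose(np.linalg.norm(w), area)
```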
It can be shown that the permutation symbol is related to the Kronecker delta by
           ⎡ δi1  δi2  δi3 ⎤         ⎡ δi1  δj1  δk1 ⎤
eijk = det ⎢ δj1  δj2  δj3 ⎥  =  det ⎢ δi2  δj2  δk2 ⎥ , (1.1.10)
           ⎣ δk1  δk2  δk3 ⎦         ⎣ δi3  δj3  δk3 ⎦
where det[·] represents the determinant of a 3 × 3 matrix. A consequence of (1.1.10) is that
                 ⎡ δip  δiq  δir ⎤
eijk epqr =  det ⎢ δjp  δjq  δjr ⎥ (1.1.11)
                 ⎣ δkp  δkq  δkr ⎦
          =  δip (δjq δkr − δjr δkq) − δiq (δjp δkr − δjr δkp) + δir (δjp δkq − δjq δkp).
Some useful identities which follow from successive contractions of (1.1.11), the substitution property of the Kronecker delta, and the identity δii = 3 are
eijk epqk = δip δjq − δiq δjp, (1.1.12)
eijk eijk = 6. (1.1.13)
Also useful is the identity
ei = ½ eijk ej × ek. (1.1.14)
Let u, v, and w be vectors. Useful relations involving the inner and cross products then include
u · (v × w) = v · (w × u) = w · (u × v),
u · (u × v) = 0,
u × v = −v × u, (1.1.15)
u × (v × w) = (u · w)v − (u · v)w.
The operation
[u, v, w] ≝ u · (v × w), (1.1.16)
is known as the scalar triple product and is physically equivalent to the (signed) volume of the
parallelepiped whose edges are defined by any three linearly independent vectors u, v, and w.
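The contraction identities above and the determinant interpretation of the scalar triple product also admit a quick numerical check (a NumPy sketch, not from the text):

```python
import numpy as np

# Alternating symbol and Kronecker delta
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0
delta = np.eye(3)

# Contraction identity e_ijk e_pqk = d_ip d_jq - d_iq d_jp, checked entrywise
lhs = np.einsum('ijk,pqk->ijpq', eps, eps)
rhs = np.einsum('ip,jq->ijpq', delta, delta) - np.einsum('iq,jp->ijpq', delta, delta)
assert np.allclose(lhs, rhs)

# Full contraction e_ijk e_ijk = 6
assert np.isclose(np.einsum('ijk,ijk->', eps, eps), 6.0)

# [u, v, w] = u . (v x w) equals the determinant of the matrix of columns u, v, w
u, v, w = np.array([1., 0., 1.]), np.array([0., 2., 0.]), np.array([1., 1., 3.])
triple = u @ np.cross(v, w)
assert np.isclose(triple, np.linalg.det(np.column_stack([u, v, w])))
```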
1.2 Tensors
1.2.1 What is a tensor?
In mathematics the term tensor is a synonym for the phrase “a linear transformation which
maps a vector into a vector.” A tensor S is therefore a linear mapping that assigns to each vector
u a vector
v = Su. (1.2.1)
One might think of a tensor S as a machine with an input and an output: if a vector u is the
input, then the vector v = Su is the output. Linearity of a tensor S is the requirement that
S(u + v) = Su + Sv for all vectors u and v,
S(αu) = αSu for all vectors u and scalars α. (1.2.2)
1.2.3 Tensor product
Another example of a tensor is the tensor product³ u ⊗ v of two vectors u and v, defined by
(u ⊗ v)w = (v · w)u (1.2.3)
for all w; the tensor u ⊗ v maps any vector w onto a scalar multiple of u.
³ Other common terminologies for a tensor product are tensor outer product or simply outer product, as well as dyadic product.
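The defining property (1.2.3) implies that the matrix of the dyad u ⊗ v is the outer product of the component columns; a small illustrative check (NumPy, not from the text):

```python
import numpy as np

# (u ⊗ v)w = (v · w)u: the dyad maps any w onto a scalar multiple of u
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.array([2.0, -1.0, 1.0])

dyad = np.outer(u, v)          # components (u ⊗ v)_ij = u_i v_j
assert np.allclose(dyad @ w, (v @ w) * u)
```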
With this definition of the components of a tensor, the Cartesian component representation of
the relation v = Su is
vi = ei · v = ei · Su
   = ei · S(uj ej)
   = (ei · Sej) uj
   = Sij uj. (1.2.6)
It should be observed that (1.2.6) indicates that each component of v is a linear combination
of the components of u, where the weights are given by the components of S.
Further, a tensor S has the representation
S = Sij ei ⊗ ej , (1.2.7)
in terms of its components Sij and the basis tensors ei ⊗ ej .
Clearly,
Sij = ½(Sij + Sji) + ½(Sij − Sji).
Thus, every tensor S admits the decomposition
S = sym S + skw S (1.2.12)
into a symmetric part and a skew part, where
sym S = ½(S + Sᵀ),
skw S = ½(S − Sᵀ), (1.2.13)
with components
(sym S)ij = ½(Sij + Sji),
(skw S)ij = ½(Sij − Sji). (1.2.14)
Note that a symmetric tensor has at most six independent components, and a skew tensor has
at most three independent components; the latter point follows since (skw S)ii = 0 (no sum).
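The decomposition (1.2.12)–(1.2.14) is easy to verify numerically; the following NumPy sketch (illustrative, not from the text) checks it for an arbitrary matrix of components:

```python
import numpy as np

S = np.array([[1.0, 2.0, 0.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

sym_S = 0.5 * (S + S.T)   # (sym S)_ij = (S_ij + S_ji)/2
skw_S = 0.5 * (S - S.T)   # (skw S)_ij = (S_ij - S_ji)/2

assert np.allclose(sym_S + skw_S, S)       # decomposition (1.2.12)
assert np.allclose(skw_S, -skw_S.T)        # skew part is indeed skew
assert np.allclose(np.diag(skw_S), 0.0)    # so its diagonal vanishes
```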
(ST)v = S(Tv) (1.2.19)
for all vectors v. By (1.2.5), Tej = Tlj el and, by (1.2.9), Sᵀei = Sik ek; thus,
(ST)ij = ei · STej = Sᵀei · Tej
       = (Sik ek) · (Tlj el)
       = Sik Tlj δkl = Sik Tkj.
The product of the identity tensor 1 with any tensor T is T itself:
(1T)v = 1(Tv) = Tv,
and since this holds for all vectors v we have
1T = T.
Alternatively, in components,
(1T)ij = ei · 1Tej
       = 1ei · Tej
       = ei · Tej
       = Tij.
The components of 1T are equal to the components of T, and hence the two tensors are equal.
tr(αS + β T) = α tr(S) + β tr(T)
for all tensors S and T and all scalars α and β . Thus, by (1.2.7),
tr S = tr(Sij ei ⊗ ej )
= Sij tr(ei ⊗ ej )
= Sij (ei · ej )
= Sii , (1.2.22)
and the trace is well-defined for all tensors. Some useful properties of the trace are
tr(Su ⊗ v) = v · Su,
tr(Sᵀ) = tr S,
tr(ST) = tr(TS), (1.2.23)
tr 1 = 3.
As a consequence of (1.2.23)₂,
tr S = 0 whenever S is skew. (1.2.24)
A tensor S is deviatoric (or traceless) if
tr S = 0, (1.2.25)
and we refer to
S′ ≡ dev S ≝ S − ⅓(tr S)1 (1.2.26)
as the deviatoric part of S,⁴ and to
⅓(tr S)1 (1.2.27)
as the spherical part of S. Trivially,
S = (S − ⅓(tr S)1) + ⅓(tr S)1 = S′ + s1,
⁴ The notation “dev” is useful for denoting the deviatoric part of the product of many tensors; e.g., dev(ST⋯M).
where s = ⅓ tr S, and it is noted that tr dev S = 0. Thus every tensor S admits the decomposition
S = S′ + s1 (1.2.28)
into a deviatoric tensor and a spherical tensor.
The deviatoric part of a tensor
T = 3e1 ⊗ e1 + 2e1 ⊗ e3
is determined as
T′ = T − ⅓(tr T)1
   = (3e1 ⊗ e1 + 2e1 ⊗ e3) − ⅓ · 3 · 1
   = 2e1 ⊗ e1 + 2e1 ⊗ e3 − e2 ⊗ e2 − e3 ⊗ e3.
The spherical part of T is given by ⅓(tr T)1 = 1.
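This worked example can be confirmed numerically; a short NumPy sketch (not part of the text):

```python
import numpy as np

# The example tensor T = 3 e1⊗e1 + 2 e1⊗e3 in components
e = np.eye(3)
T = 3.0 * np.outer(e[0], e[0]) + 2.0 * np.outer(e[0], e[2])

dev_T = T - np.trace(T) / 3.0 * np.eye(3)   # deviatoric part
sph_T = np.trace(T) / 3.0 * np.eye(3)       # spherical part

expected = (2.0 * np.outer(e[0], e[0]) + 2.0 * np.outer(e[0], e[2])
            - np.outer(e[1], e[1]) - np.outer(e[2], e[2]))
assert np.allclose(dev_T, expected)
assert np.allclose(sph_T, np.eye(3))        # equals 1 here since tr T = 3
assert np.isclose(np.trace(dev_T), 0.0)     # deviatoric part is traceless
```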
Consider a symmetric tensor S (Sij = Sji ) and a skew tensor Ω (Ωij = −Ωji ). Their
double contraction S : Ω = Sij Ωij will always be zero:
Sij Ωij = −Sij Ωji = −Sji Ωji = −Sij Ωij .
Since the only scalar with the property that it is equal to its negative is zero, we conclude
that Sij Ωij = 0 whenever S is symmetric and Ω is skew.
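The vanishing of the double contraction between a symmetric and a skew tensor is likewise easy to check numerically (an illustrative NumPy sketch, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

S = 0.5 * (A + A.T)   # symmetric part of an arbitrary matrix
W = 0.5 * (A - A.T)   # skew part

# Double contraction S : W = S_ij W_ij vanishes identically
assert np.isclose(np.einsum('ij,ij->', S, W), 0.0)
```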
• All the operations and operators we have introduced for vectors and tensors are in one-to-one correspondence with the same operations and operators for matrices.
For example,
         ⎡ S11  S12  S13 ⎤ ⎡ u1 ⎤
[S][u] = ⎢ S21  S22  S23 ⎥ ⎢ u2 ⎥
         ⎣ S31  S32  S33 ⎦ ⎣ u3 ⎦
         ⎡ S11 u1 + S12 u2 + S13 u3 ⎤
       = ⎢ S21 u1 + S22 u2 + S23 u3 ⎥
         ⎣ S31 u1 + S32 u2 + S33 u3 ⎦
         ⎡ Σi S1i ui ⎤
       = ⎢ Σi S2i ui ⎥
         ⎣ Σi S3i ui ⎦
       = [Su],
so that the action of a tensor on a vector is consistent with that of a 3 × 3 matrix on a 3 × 1
matrix. Further, the matrix [Sᵀ] of the transpose Sᵀ of S is identical to the transposition of the matrix [S]:
              ⎡ S11  S21  S31 ⎤
[Sᵀ] ≡ [S]ᵀ = ⎢ S12  S22  S32 ⎥ .
              ⎣ S13  S23  S33 ⎦
Similarly, the trace of a tensor S is equivalent to the conventional definition of this quantity
from matrix algebra
tr S ≡ tr [S] = S11 + S22 + S33 = Skk .
The inner product of two vectors is
                             ⎡ v1 ⎤
u · v ≡ [u]ᵀ[v] = [u1 u2 u3] ⎢ v2 ⎥ = u1 v1 + u2 v2 + u3 v3,
                             ⎣ v3 ⎦
and the tensor product
                    ⎡ u1 ⎤              ⎡ u1 v1  u1 v2  u1 v3 ⎤
[u ⊗ v] ≡ [u][v]ᵀ = ⎢ u2 ⎥ [v1 v2 v3] = ⎢ u2 v1  u2 v2  u2 v3 ⎥ .
                    ⎣ u3 ⎦              ⎣ u3 v1  u3 v2  u3 v3 ⎦
If one chooses to perform computations using matrix representations of vectors and tensors,
care must be taken to use a single basis in all calculations, as the matrix representations are
basis dependent.
det S ≡ det[S]
= S11 (S22 S33 − S23 S32) − S12 (S21 S33 − S23 S31) + S13 (S21 S32 − S22 S31)
= eijk Si1 Sj2 Sk3 = ⅙ eijk epqr Sip Sjq Skr. (1.2.31)
It is useful also to observe that
det(ST) = det S det T, (1.2.32)
and that
det Sᵀ = det S. (1.2.33)
Note also that det S⁻¹ = 1/det S.
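The index formula (1.2.31) and the product rule (1.2.32) can be checked against NumPy's determinant (a numerical sketch, not from the text):

```python
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

S = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 4.0]])
T = np.array([[1.0, 0.0, 2.0],
              [2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# det S = e_ijk S_i1 S_j2 S_k3 = (1/6) e_ijk e_pqr S_ip S_jq S_kr  (1.2.31)
d1 = np.einsum('ijk,i,j,k->', eps, S[:, 0], S[:, 1], S[:, 2])
d2 = np.einsum('ijk,pqr,ip,jq,kr->', eps, eps, S, S, S) / 6.0
assert np.isclose(d1, np.linalg.det(S)) and np.isclose(d2, np.linalg.det(S))

# det(ST) = det S det T  (1.2.32), and det S^T = det S  (1.2.33)
assert np.isclose(np.linalg.det(S @ T), np.linalg.det(S) * np.linalg.det(T))
assert np.isclose(np.linalg.det(S.T), np.linalg.det(S))
```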
[Figure: two positively oriented orthonormal bases {e1, e2, e3} and {e∗1, e∗2, e∗3} sharing a common origin o.]
We now consider how the two representations (vi, Sij) and (v∗i, S∗ij) in the two coordinate systems are related to each other.
Let Q be the rotation tensor defined by the components Qij = e∗i · ej, so that e∗i = Qik ek.
Transformation relation for tensors
For a tensor S the components with respect to the basis {e∗i} are
S∗ij = e∗i · Se∗j = Qik ek · S Qjl el = Qik Qjl ek · Sel = Qik Qjl Skl.
Thus, the transformation relation for the Cartesian components of a tensor S under a change of basis is
S∗ij = Qik Qjl Skl. (1.2.43)
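The transformation rule (1.2.43) can be verified for a concrete rotation; the choice of Q below (a rotation about e3 by 30°) is purely illustrative:

```python
import numpy as np

# Rows of Q are the components of the starred basis vectors in the old basis
th = np.pi / 6.0
Q = np.array([[ np.cos(th), np.sin(th), 0.0],
              [-np.sin(th), np.cos(th), 0.0],
              [ 0.0,        0.0,        1.0]])
assert np.allclose(Q @ Q.T, np.eye(3))   # Q is orthogonal

S = np.array([[1.0, 2.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# S*_ij = Q_ik Q_jl S_kl  (1.2.43), equivalently [S*] = [Q][S][Q]^T
S_star = np.einsum('ik,jl,kl->ij', Q, Q, S)
assert np.allclose(S_star, Q @ S @ Q.T)
```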
Sμ = ωμ. (1.2.45)
For (1.2.45) to possess a non-trivial (μ ≠ 0) solution, the tensor (S − ω1) cannot be invertible, so that, by (1.2.35),
det(S − ω1) = 0.
Thus, the determinant of the 3 × 3 matrix [S] − ω[1] must vanish,
det([S] − ω[1]) = 0,
which is simply the classic requirement for a system of homogeneous equations to have a non-
trivial solution. Each eigenvalue ω of a tensor S is, therefore, a solution of a polynomial equation
of the form ω 3 − a1 ω 2 + a2 ω − a3 = 0, where the coefficients are functions of S. A tedious
computation shows that this characteristic equation5 has the explicit form
ω 3 − I1 (S)ω 2 + I2 (S)ω − I3 (S) = 0, (1.2.47)
where I1 (S), I2 (S), and I3 (S), called the principal invariants6 of S, are given by
I1(S) = tr S,
I2(S) = ½[(tr S)² − tr(S²)], (1.2.48)
I3(S) = det S.
Ik (S) are called invariants because of the way they transform under the group of orthogonal
tensors:
Ik (QSQ ) = Ik (S) for any orthogonal tensor Q. (1.2.49)
The solutions of the characteristic equation (1.2.47), cubic in ω , are the eigenvalues ωi ,
i = 1, 2, 3. Since the principal invariants Ik (S) are always real, the theory of polynomials tells
us that
• The characteristic equation has (i) either three real roots (not necessarily distinct), or (ii)
one real and two complex conjugate roots.
Once the eigenvalues have been determined, the eigenvectors corresponding to each eigenvalue
can be found by substituting back into (1.2.45).
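Both the characteristic equation (1.2.47) and the Cayley–Hamilton relation of footnote 5 can be checked numerically (a NumPy sketch, not from the text; the symmetric example matrix is an arbitrary choice):

```python
import numpy as np

S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # symmetric, so the eigenvalues are real

# Principal invariants (1.2.48)
I1 = np.trace(S)
I2 = 0.5 * (np.trace(S)**2 - np.trace(S @ S))
I3 = np.linalg.det(S)

# Each eigenvalue solves w^3 - I1 w^2 + I2 w - I3 = 0  (1.2.47)
for w in np.linalg.eigvalsh(S):
    assert np.isclose(w**3 - I1 * w**2 + I2 * w - I3, 0.0)

# Cayley-Hamilton: S satisfies its own characteristic equation (1.2.46)
CH = S @ S @ S - I1 * (S @ S) + I2 * S - I3 * np.eye(3)
assert np.allclose(CH, 0.0)
```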
⁵ The Cayley–Hamilton theorem also tells us that a tensor satisfies its characteristic equation; that is,
S³ − I1(S)S² + I2(S)S − I3(S)1 = 0. (1.2.46)
⁶ The principal invariants can also be defined in terms of any collection of three vectors a, b, c, with [a, b, c] ≠ 0, via the relations
I1(S) = ([Sa, b, c] + [a, Sb, c] + [a, b, Sc])/[a, b, c],
I2(S) = ([Sa, Sb, c] + [a, Sb, Sc] + [Sa, b, Sc])/[a, b, c],
I3(S) = [Sa, Sb, Sc]/[a, b, c].
Since S is symmetric, with Sμ1 = ω1 μ1 and Sμ2 = ω2 μ2,
0 = μ2 · Sμ1 − ω1 μ2 · μ1
  = Sμ2 · μ1 − ω1 μ2 · μ1
  = ω2 μ2 · μ1 − ω1 μ2 · μ1
  = (ω2 − ω1) μ2 · μ1.
Since ω1 ≠ ω2, we must have
μ2 · μ1 = 0.
Thus, for a symmetric tensor, eigenvectors corresponding to distinct eigenvalues are orthogo-
nal. When the eigenvalues are repeated, the situation is a bit more complex and is treated below.
Notwithstanding, if S is symmetric, with distinct eigenvalues {ω1 > ω2 > ω3 }, then they are
real, and there exists an orthonormal set of corresponding eigenvectors {μ1 , μ2 , μ3 } such
that Sμi = ωi μi (no sum).
Spectral Theorem Let S be symmetric with distinct eigenvalues ωi . Then there is a corre-
sponding orthonormal set {μi } of eigenvectors of S and, what is most important, one uniquely
has that
S = Σ_{i=1}^{3} ωi μi ⊗ μi. (1.2.50)
The relation (1.2.50), which is called a spectral decomposition of S, gives S as a linear combi-
nation of projections, with each μi ⊗ μi (no sum) a projection tensor onto the eigenvector μi .
Since the eigenvectors {μi} are orthonormal, they can also function as a basis, and in this basis the matrix representation [S] is diagonal:
      ⎡ ω1  0   0  ⎤
[S] = ⎢ 0   ω2  0  ⎥ .
      ⎣ 0   0   ω3 ⎦
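The spectral decomposition (1.2.50) can be reproduced with a numerical eigensolver (an illustrative NumPy sketch, not from the text; the symmetric example matrix has distinct eigenvalues 1, 3, 5):

```python
import numpy as np

S = np.array([[2.0, 0.0, 1.0],
              [0.0, 5.0, 0.0],
              [1.0, 0.0, 2.0]])   # symmetric, distinct eigenvalues

w, V = np.linalg.eigh(S)          # w: eigenvalues; V[:, i]: orthonormal eigenvectors

# Spectral decomposition S = sum_i w_i mu_i ⊗ mu_i  (1.2.50)
S_rebuilt = sum(w[i] * np.outer(V[:, i], V[:, i]) for i in range(3))
assert np.allclose(S_rebuilt, S)

# In the eigenvector basis the matrix of S is diagonal
assert np.allclose(V.T @ S @ V, np.diag(w))
```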
When the eigenvalues of S are repeated, the spectral theorem needs to be modified. The
remaining relevant cases are when two of the eigenvalues are the same and when all three
eigenvalues are the same. Respectively, in these cases one has
(a) If ω1 ≠ ω2 = ω3, then S admits the representation
S = ω1 μ1 ⊗ μ1 + ω2 (1 − μ1 ⊗ μ1 ), (1.2.51)
which indicates that μ1 , as well as any vector orthogonal to μ1 , is an eigenvector of S.
(b) If ω ≡ ω1 = ω2 = ω3 , then S admits the representation
S = ω1, (1.2.52)
which indicates that any vector will qualify as an eigenvector of S and that S is spherical.
In terms of the eigenvalues, the principal invariants are given by
I1(S) = ω1 + ω2 + ω3,
I2(S) = ω1 ω2 + ω2 ω3 + ω3 ω1, (1.2.53)
I3(S) = ω1 ω2 ω3.
Invariants play an important role in continuum mechanics. As is clear from (1.2.53), for a tensor S the eigenvalues {ω1, ω2, ω3} are the basic invariants in the sense that any invariant of S can be expressed in terms of them. In many applications, as an alternative to (1.2.53), it is often more convenient to choose as invariants the following three symmetric functions of {ω1, ω2, ω3}:
tr S = ω1 + ω2 + ω3,
tr S² = ω1² + ω2² + ω3², (1.2.54)
tr S³ = ω1³ + ω2³ + ω3³.
These three quantities are clearly invariant and they are independent in the sense that no one
of them can be expressed in terms of the other two.
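The moment relations (1.2.54) follow from the spectral decomposition and are easy to confirm numerically (a NumPy sketch, not from the text):

```python
import numpy as np

S = np.array([[4.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])   # an arbitrary symmetric example

w = np.linalg.eigvalsh(S)

# tr S^k = w1^k + w2^k + w3^k for k = 1, 2, 3  (1.2.54)
assert np.isclose(np.trace(S), w.sum())
assert np.isclose(np.trace(S @ S), (w**2).sum())
assert np.isclose(np.trace(S @ S @ S), (w**3).sum())
```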
As a further alternative set of invariants, recall that a tensor S may always be decomposed into a deviatoric and a spherical part as
S = S′ + ⅓(tr S)1, with tr S′ = 0, (1.2.55)
where S′ is the deviatoric part and ⅓(tr S)1 the spherical part.
The decomposition (1.2.55) leads to the possibility of using the invariant tr S and the invariants of S′ as an alternative set of invariants. Since S′ has only two independent non-zero invariants,
tr S′², tr S′³, (1.2.56)
it is sometimes convenient to adopt the list
tr S, tr S′², tr S′³ (1.2.57)
as a set of alternative invariants of a tensor S.
B = CA. (1.2.58)
The latter form, C : S, emphasizes the action of C via its component form, which is discussed
next.
Recall that the components Tij of a second-order tensor are defined as
Tij = ei · Tej .
For discussing the component form of fourth-order tensors it is convenient to introduce the
basis tensors
Eij ≝ ei ⊗ ej, (1.2.59)
with the orthonormality property
Eij : Ekl = δik δjl .
Using this notation, the components Cijkl of C are defined as
Cijkl = Eij : CEkl . (1.2.60)
This allows one to also express a fourth-order tensor as
C = Cijkl Eij ⊗ Ekl ≡ Cijkl ei ⊗ ej ⊗ ek ⊗ el . (1.2.61)
From these expressions, one can also infer the component-wise action of a fourth-order tensor
on a second-order tensor as
S = CT, Sij = Cijkl Tkl . (1.2.62)
Isym T ≡ Isym : T = ½(δik δjl + δil δjk) Eij ⊗ Ekl : (Tpq Epq)
= ½(δik δjl + δil δjk) Tpq (Ekl : Epq) Eij
= ½(δik δjl + δil δjk) Tpq δkp δlq Eij
= ½(δip δjq + δiq δjp) Tpq Eij
= ½(Tij + Tji) Eij
= sym T.
If T is already symmetric, then Isym functions as an alternate identity tensor.
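The symmetrizer Isym and its action (1.2.62) can be built explicitly as a four-index array (an illustrative NumPy sketch, not from the text):

```python
import numpy as np

delta = np.eye(3)

# Fourth-order symmetrizer (Isym)_ijkl = (d_ik d_jl + d_il d_jk)/2
Isym = 0.5 * (np.einsum('ik,jl->ijkl', delta, delta)
              + np.einsum('il,jk->ijkl', delta, delta))

T = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# Isym : T = sym T, via S_ij = C_ijkl T_kl  (1.2.62)
assert np.allclose(np.einsum('ijkl,kl->ij', Isym, T), 0.5 * (T + T.T))

# On an already-symmetric tensor, Isym acts as an identity
sym_T = 0.5 * (T + T.T)
assert np.allclose(np.einsum('ijkl,kl->ij', Isym, sym_T), sym_T)
```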