MATH 4A - Linear Algebra with Applications
Lecture 5: Linear independence and linear transformations
10 April 2019
Reading: §1.7-1.9 from Lay, 5th ed.
Recommended problems from §1.7: 1, 3, 5, 7, 9, 11, 13, 21-23, 27,
33, 35, 39
Recommended problems from §1.8: 1, 3, 7, 9, 11, 17, 21, 22, 29
Announcement: you can skip Sections 1.6 and 1.10 on
applications for now. We will double back on them in two weeks.
Lecture plan
1 Linear independence
2 Linear transformations
Motivation
Every linear system in n variables with more than one solution has
infinitely many solutions. Moreover, the set of solutions is a
k-dimensional plane in Rn for some k. However, we don’t yet know
how to compute k (except maybe in specific examples).
Linear independence is a key concept that eventually will allow us
to compute k.
I would argue that linear independence and linear dependence are
the most important concepts in linear algebra. So pay
attention today!
Definition
A set of vectors {v1 , v2 , . . . , vp } in Rn is linearly independent if the
homogeneous vector equation
x1 v1 + x2 v2 + · · · + xp vp = 0
has only the trivial solution. The set {v1 , v2 , . . . , vp } is linearly
dependent if it is not linearly independent.
Equivalent definitions of linearly dependent
In other words, {v1 , v2 , . . . , vp } is linearly dependent if there exist
weights c1 , c2 , . . . , cp , not all zero, such that
c1 v1 + c2 v2 + · · · + cp vp = 0.
Another equivalent definition: {v1 , v2 , . . . , vp } is linearly
dependent if the homogeneous vector equation
x1 v1 + x2 v2 + · · · + xp vp = 0
has a nontrivial solution.
Linear dependence relations
If the set {v1 , v2 , . . . , vp } is linearly dependent, a choice of
c1 , c2 , . . . , cp , not all zero, such that
c1 v1 + c2 v2 + · · · + cp vp = 0
is called a linear dependence relation among v1 , v2 , . . . , vp .
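For instance (a toy example of my own, not from the text): if v1 = (1, 2) and v2 = (2, 4),
then 2v1 − v2 = 0, so the weights c1 = 2, c2 = −1 give a linear dependence relation.
A quick numerical check, assuming NumPy is available:

    import numpy as np

    v1 = np.array([1.0, 2.0])
    v2 = np.array([2.0, 4.0])

    # the weights 2 and -1 are not all zero and the combination is the zero vector,
    # so this is a linear dependence relation among v1 and v2
    print(2 * v1 - 1 * v2)   # [0. 0.]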
Small sets: one vector, i.e. p = 1
Let v1 be some vector in Rn . When is the set {v1 } consisting of
only the one vector v1 linearly independent?
Well, if v1 ≠ 0, the single-variable homogeneous linear system
x1 v1 = 0
has no nontrivial solutions. So {v1 } is a linearly independent set if
v1 ≠ 0.
On the other hand, if v1 = 0, we can plug any scalar in for x1 in
the equation
x1 v1 = 0
and the result is true. Thus the system has nontrivial solutions.
We conclude {v1 } is a linearly dependent set if v1 = 0. Every
choice of x1 ≠ 0 yields a linear dependence relation.
Small sets: two vectors, i.e. p = 2
Consider a set {v1 , v2 } consisting of two vectors in Rn . When is
this set linearly independent?
Well, we can always rewrite the vector equation
x1 v1 + x2 v2 = 0
as
x1 v1 = −x2 v2 .
Given any nontrivial solution, one of x1 or x2 must be nonzero.
Let’s assume x2 is nonzero. Then we can rewrite the equation as
v2 = −(x1 /x2 ) v1 .
Conclusion: {v1 , v2 } is linearly dependent if and only if one of the
vectors is a multiple of the other vector. {v1 , v2 } is linearly
independent if and only if neither vector is a multiple of the other.
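As a concrete check of this criterion (the vectors are my own illustrative choices,
assuming NumPy): one column of [v1 v2] is a multiple of the other exactly when that
matrix has rank less than 2.

    import numpy as np

    v1 = np.array([1.0, 3.0, -2.0])
    v2 = np.array([2.0, 6.0, -4.0])   # v2 = 2 * v1, so {v1, v2} is linearly dependent

    A = np.column_stack([v1, v2])
    # rank < number of columns  <=>  Ax = 0 has a nontrivial solution  <=>  dependence
    print(np.linalg.matrix_rank(A) < 2)   # True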
In general: linear dependence can be recast in terms of a
homogeneous linear system
More precisely, a linear dependence relation among v1 , v2 , . . . , vp
is exactly the same thing as a nontrivial solution to the
homogeneous system:
Ax = 0
where A is the matrix
A = [ v1 v2 · · · vp ].
Thus, in general, we can use matrix methods (i.e. row operations,
reduced echelon forms, etc.) to decide if a set of vectors is linearly
independent or not.
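For example (illustrative vectors of my own, assuming the SymPy library is available),
we can row reduce A = [v1 v2 v3] and look for non-pivot columns, which correspond to
free variables:

    from sympy import Matrix

    # the columns of A are three illustrative vectors in R^3
    A = Matrix([[1, 2, 3],
                [2, 4, 6],
                [0, 1, 1]])

    R, pivot_cols = A.rref()   # reduced echelon form and the indices of the pivot columns
    print(R)
    # a non-pivot column means a free variable, so Ax = 0 has a nontrivial solution
    # and the columns of A are linearly dependent
    print(len(pivot_cols) < A.cols)   # True here: 2 pivots for 3 columns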
Linear independence of matrix columns
In other words:
The columns of a matrix A are linearly independent if and only if
the equation Ax = 0 has only the trivial solution.
iClicker 1
Consider the vectors
v1 = (1, 0, 0),   v2 = (4, −3, 0),   v3 = (2, −3, 0)
Is the set {v1 , v2 , v3 } linearly dependent?
(a) Yes
(b) No
(c) Can’t nobody tell me nothin’
(d) You can’t tell me nothin’
(e) I’m gonna ride ’til I can’t no more
Characterizing linearly dependent sets in terms of spans
Theorem
A set {v1 , . . . , vp } of two or more vectors in Rn is linearly
dependent if and only if at least one of the vectors is a linear
combination of the others. In fact, if the vectors are linearly
dependent and v1 ≠ 0, then some vj (with j > 1) is a linear
combination of the preceding vectors v1 , . . . , vj−1 .
In other words, if v1 ≠ 0, then the set {v1 , . . . , vp } is
linearly dependent if and only if some vj (with 1 < j ≤ p) is in
Span{v1 , . . . , vj−1 }.
Example
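One way such an example can go (the vectors below are my own illustrative choice, not
necessarily the one worked in lecture): take v1 = (1, 0), v2 = (0, 1), v3 = (2, 3) in R2.
Then v3 = 2v1 + 3v2, so v3 lies in Span{v1 , v2 } and the set {v1 , v2 , v3 } is linearly
dependent. A quick check, assuming NumPy:

    import numpy as np

    v1 = np.array([1.0, 0.0])
    v2 = np.array([0.0, 1.0])
    v3 = np.array([2.0, 3.0])

    # v3 is a linear combination of the preceding vectors v1 and v2,
    # which gives the linear dependence relation 2*v1 + 3*v2 - v3 = 0
    print(np.allclose(2 * v1 + 3 * v2 - v3, 0))   # True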
Linear dependence “for free”
Sometimes it’s pretty easy to see that a set of vectors must be
linearly dependent.
Theorem
Let {v1 , . . . , vp } be a set of vectors in Rn with p > n. Then the
set must be linearly dependent.
Proof: Let A = [ v1 · · · vp ]. Then the equation Ax = 0 is a
system of n equations in p variables. Since p > n, there must be a
free variable, so Ax = 0 has a nontrivial solution.
Theorem
Let {v1 , . . . , vp } be a set of vectors in Rn . If at least one of the
vectors vi for some i = 1, . . . , p equals the 0 vector, then the set
{v1 , . . . , vp } is linearly dependent.
Proof: 0v1 + · · · + 0vi−1 + 1vi + 0vi+1 + · · · + 0vp = 0 is a linear
dependence relation.
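A quick numerical illustration of the first theorem (random vectors of my own choosing,
assuming NumPy): any four vectors in R3 give a matrix with at most three pivots, hence
a free variable.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 4))   # four random vectors in R^3, stored as the columns of A

    # the rank of a 3 x 4 matrix is at most 3, so there is always a free variable
    print(np.linalg.matrix_rank(A) < A.shape[1])   # True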
iClicker 2
Are the vectors given by the columns of the following array linearly independent?
     1        432     1     532
     8          1     2     734
   234      −347e     3       3     1
 −4213π       4       4       1
(a) Yes
(b) No
iClicker 3
Are the vectors
(2, 3, 5),   (0, 0, 0),   (1, 1, 1)
linearly independent?
(a) Yes
(b) No
iClicker 4
Are the vectors
(32, 4, 8, 13) and (64, 8, 16, −26)
linearly independent?
(a) Yes
(b) No
Matrices as functions
So far, we’ve simply thought of matrices as convenient
bookkeeping tools. Namely, matrices provide a compact notation for
expressing linear systems.
However, we can also think of a matrix “dynamically”: if A is an
m × n matrix, it “converts” a vector x in Rn into a vector Ax in
Rm . Solving the equation Ax = b is equivalent to asking: which
vectors x get converted into b by A?
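As a tiny numerical illustration of this point of view (the matrix and vector are my own
choices, assuming NumPy):

    import numpy as np

    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 3.0]])   # a 2 x 3 matrix, i.e. a "machine" from R^3 to R^2
    x = np.array([1.0, 1.0, 1.0])

    print(A @ x)   # [3. 4.]  -- A converts the vector x in R^3 into the vector Ax in R^2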
Generalities on functions
A transformation (or function or mapping) T from Rn to Rm is a
rule that converts each vector x in Rn to a vector T (x) in Rm .
If T is a transformation from Rn to Rm , we call Rn the domain of
T and Rm the codomain of T .
Given x ∈ Rn , we call the vector T (x) in Rm the image of x under
T.
The set of all vectors in Rm that are the image of some vector in
Rn is called the range of T .
Linear independence Linear transformations
A schematic picture of a transformation
(Figure: a vector x in the domain Rn is sent by T to its image T (x) in the codomain Rm ;
the range is the set of all such images inside Rm .)
Matrix transformation example 1: projection
If A is the matrix
[ 1 0 0 ]
[ 0 1 0 ]
[ 0 0 0 ],
the transformation x ↦ Ax projects points in R3 onto the x1 x2 -plane inside of R3 .
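A quick check of the projection on a sample point (the point is my own choice,
assuming NumPy):

    import numpy as np

    A = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0]])
    x = np.array([2.0, -5.0, 7.0])

    print(A @ x)   # [ 2. -5.  0.]  -- the x3-coordinate is sent to 0, as a projection should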
Matrix transformation example 1: demonstrating
projection with the room’s projector
The projector takes things (vectors) to their shadow (image) on
the projection screen (x1 x2 -plane). The line pointing from the
screen to the projector bulb is parallel to the x3 axis. (To be
precise, we should assume the projector is infinitely far away from
the screen...)
Matrix transformation example 2: here’s a sheep
Matrix transformation example 2: here’s a sheared sheep
Matrix transformation example 2: what just happened?
I applied the transformation with matrix
[ 1  0.6 ]
[ 0   1  ]
from R2 to R2 . This is an example of a shear transformation.
So, we sheared the sheep. (Pun definitely intended.)
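A sketch of how this shear moves a sample point (the point is my own choice,
assuming NumPy):

    import numpy as np

    S = np.array([[1.0, 0.6],
                  [0.0, 1.0]])   # the shear matrix from the slide
    p = np.array([0.0, 1.0])     # a point at height 1 on the x2-axis

    print(S @ p)   # [0.6 1. ]  -- points are pushed horizontally in proportion to their height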
Matrix transformation example 3: dilation and contraction
Let r be a scalar and consider the matrix transformation from R2 to R2 with matrix
A = [ r 0 ]
    [ 0 r ].
(Figure: vectors v1 , v2 , v3 , v4 in the plane and their stretched images Av1 , Av2 , Av3 , Av4 .)
If r ≥ 1, we “dilate” the vectors (that is, we stretch them).
Matrix transformation example 3: dilation and contraction
Let r be a scalar and consider the matrix transformation from R2 to R2 with matrix
A = [ r 0 ]
    [ 0 r ].
(Figure: vectors v1 , v2 , v3 , v4 in the plane and their images Av1 , Av2 , Av3 , Av4 , shrunk toward the origin.)
If 0 ≤ r ≤ 1, we “contract” the vectors (that is, we shrink them).
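A numerical sketch of both cases (the scalars and vector are my own choices,
assuming NumPy):

    import numpy as np

    def A(r):
        # the matrix [[r, 0], [0, r]] from the slide
        return np.array([[r, 0.0], [0.0, r]])

    v = np.array([1.0, 2.0])
    print(A(3.0) @ v)   # [3. 6.]   -- r = 3 > 1: dilation, the vector is stretched
    print(A(0.5) @ v)   # [0.5 1. ] -- r = 0.5 < 1: contraction, the vector is shrunk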
Matrix transformations are extremely special because they
are linear
That is, if A is an m × n matrix, then A(u + v) = Au + Av and
A(cv) = c(Av) for all vectors u, v in Rn and all scalars c ∈ R.
This property is so useful, we abstract it out into its own
definition: a transformation T is linear if:
(i) T (u + v) = T (u) + T (v) for all u, v in the domain of T
(ii) T (cu) = cT (u) for all scalars c and all u in the domain of T
Of course, a matrix transformation is always a linear
transformation.
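A quick numerical sanity check that a matrix transformation satisfies (i) and (ii)
(random data of my own, assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((3, 2))   # a 3 x 2 matrix: a transformation from R^2 to R^3
    u = rng.standard_normal(2)
    v = rng.standard_normal(2)
    c = 7.0

    print(np.allclose(A @ (u + v), A @ u + A @ v))   # True: property (i)
    print(np.allclose(A @ (c * u), c * (A @ u)))     # True: property (ii)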
Extremely useful properties of linear transformations
If T is linear, then T (0) = 0.
If T is linear, then T (cu + dv) = cT (u) + dT (v) for all vectors
u, v in the domain of T and all scalars c, d.
More generally, if v1 , v2 , . . . , vp are in the domain of T and
c1 , c2 , . . . , cp are scalars, then
T (c1 v1 + c2 v2 + · · · + cp vp ) = c1 T (v1 ) + c2 T (v2 ) + · · · + cp T (vp ).
In words: a linear transformation takes linear combinations to
linear combinations.
iClicker 5
Suppose T is a linear transformation from R2 to R1 = R that takes
u = (1, 32) to 5 and v = (42, −4329) to −3.
Which of the following vectors in R2 maps to 2?
(a) 2u + 3v
(b) 0 + 2
(c) 4u − 6v
(d) 2u − 3v
(e) 4u + 6v
Remarks
We don’t actually need to know what u and v are to answer this
question.
Answer (b), 0 + 2, doesn’t make sense: 0 is in R2 but 2 is in R. We can’t add
them!
Beware!
Not every transformation from Rn to Rm is linear!
(Although, in linear algebra class, the main object of study is linear
transformations.)
iClicker 6
Only one of the following is always linear. Which is it?
(a) Any transformation that sends 0 to 0
(b) A transformation from R2 to R2 that sends (x1 , 0) to (−10, 0)
for all x1 and (0, x2 ) to (0, 9) for all x2
(c) A transformation from R10 to R = R1 that sends every vector
to (3).
(d) A transformation from R10 to R2 that sends every vector
to 0 = (0, 0).
(e) The transformation from R to R that sends x to 2x − 1.