# Linear Dependence And Linear Independence

In this article we will learn about the linear dependence and linear independence of vectors.

## Linear Dependence

For a vector space V defined over a field F, the n vectors α_{1}, α_{2}, …, α_{n} ∈ V are said to be linearly dependent if there exist scalars c_{1}, c_{2}, …, c_{n} ∈ F, not all zero (where zero is the additive identity of F), such that c_{1} α_{1} + c_{2} α_{2} + … + c_{n} α_{n} = θ, where θ is the null vector of V.

## Linear Independence

For a vector space V defined over a field F, the n vectors α_{1}, α_{2}, …, α_{n} ∈ V are said to be linearly independent if and only if c_{1} α_{1} + c_{2} α_{2} + … + c_{n} α_{n} = θ, c_{i} ∈ F (i = 1, 2, …, n) implies that c_{1} = c_{2} = … = c_{n} = 0.
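The definitions above reduce, for coordinate vectors, to a rank computation: the vectors are independent exactly when the matrix having them as rows has rank equal to their number. A minimal Python sketch of such a check (the function names `rank` and `independent` are illustrative, not from the article; exact rational arithmetic avoids floating-point pitfalls):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix via Gaussian elimination over exact rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # number of pivot rows found so far
    for col in range(len(m[0])):
        # find a row at or below r with a non-zero entry in this column
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # eliminate this column from every other row
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors are linearly independent iff rank equals their count."""
    return rank(vectors) == len(vectors)
```

For example, `independent([(1, 0), (0, 1)])` is `True`, while `independent([(1, 2, 3), (2, 4, 6)])` is `False` since the second vector is twice the first.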

**Example 01**

The coordinate vectors α_{1} = (1, 1, 0), α_{2} = (3, 2, 1) and α_{3} = (2, 1, 1) are linearly dependent if there exists a set of scalars c_{1}, c_{2}, c_{3}, not all zero, such that c_{1} (1, 1, 0) + c_{2} (3, 2, 1) + c_{3} (2, 1, 1) = (0, 0, 0).

This requires that

c_{1} + 3c_{2} + 2c_{3} = 0

c_{1} + 2c_{2} + c_{3} = 0

c_{2} + c_{3} = 0

This system of homogeneous linear equations has a non-zero solution because the rank of the coefficient matrix is 2, which is less than 3, the number of unknowns. We may also solve directly to check that c_{1} = 1, c_{2} = -1, c_{3} = 1 is a solution of the system. Hence, (1) α_{1} + (-1) α_{2} + (1) α_{3} = θ.

Thus the vectors α_{1}, α_{2}, α_{3} are linearly dependent, and any one of them can be written as a linear combination of the other two. For example, α_{1} = α_{2} – α_{3}.
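The relation found in Example 01 can be verified numerically. A short Python check (variable names are illustrative):

```python
# Verify Example 01: the scalars (1, -1, 1) send the combination to the null vector.
a1, a2, a3 = (1, 1, 0), (3, 2, 1), (2, 1, 1)
c1, c2, c3 = 1, -1, 1
combo = tuple(c1*x + c2*y + c3*z for x, y, z in zip(a1, a2, a3))
print(combo)                                        # (0, 0, 0)
print(a1 == tuple(y - z for y, z in zip(a2, a3)))   # α1 = α2 - α3 -> True
```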

**Example 02**

The coordinate vectors α_{1} = (3, 2, 1), α_{2} = (0, 1, 2) and α_{3} = (1, 0, 2) are linearly independent.

Suppose that c_{1}, c_{2}, c_{3} are scalars such that c_{1} (3, 2, 1) + c_{2} (0, 1, 2) + c_{3} (1, 0, 2) = (0, 0, 0).

This requires that

3c_{1} + c_{3} = 0

2c_{1} + c_{2} = 0

c_{1} + 2c_{2} + 2c_{3} = 0

It may be checked that the rank of the coefficient matrix is 3, which equals the number of unknowns. Hence the only solution of the system is c_{1} = c_{2} = c_{3} = 0; the same conclusion follows by solving the equations directly.

Hence, by definition, it follows that the given vectors are linearly independent.
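Since the coefficient matrix here is square, its rank is 3 exactly when its determinant is non-zero. A small Python check by cofactor expansion (the helper `det3` is illustrative):

```python
# Determinant of a 3x3 matrix by cofactor expansion along the first row.
def det3(m):
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# Rows are the equations 3c1 + c3 = 0, 2c1 + c2 = 0, c1 + 2c2 + 2c3 = 0.
M = [(3, 0, 1), (2, 1, 0), (1, 2, 2)]
print(det3(M))  # 9 -- non-zero, so only the trivial solution exists
```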

**Theorem 01**

**A collection of vectors containing the null vector is linearly dependent.**

**Proof:**

Let α_{1}, α_{2}, …, α_{r}, θ be the collection of vectors.

We can write 0·α_{1} + 0·α_{2} + … + 0·α_{r} + 1·θ = θ + θ + … + θ = θ.

We see that among the scalars 0, 0, …, 0, 1 at least one, namely 1, is non-zero.

So, α_{1}, α_{2}, …, α_{r}, θ are linearly dependent.

**Theorem 02**

**A collection of vectors which contains a collection of linearly dependent vectors is linearly dependent.**

**Example**

The vectors (1, 2, 3), (2, 4, 6), (5, 9, 1), (-6, 7, 8) and (11, 2, 5) are linearly dependent.

We see 2·(1, 2, 3) + (−1)·(2, 4, 6) = (0, 0, 0).

So (1, 2, 3) and (2, 4, 6) are linearly dependent, and by the above theorem the given five vectors are also linearly dependent.
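The dependence of the pair is easy to confirm in Python (variable names are illustrative):

```python
# Verify that (2, 4, 6) is twice (1, 2, 3), so the pair is dependent.
u, v = (1, 2, 3), (2, 4, 6)
combo = tuple(2*a - b for a, b in zip(u, v))
print(combo)  # (0, 0, 0) -- a non-trivial combination giving the null vector
```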

**Theorem 03**

**Any part of a collection of linearly independent vectors is linearly independent.**

**Theorem 04**

**The n n-tuples (a_{11}, a_{12}, …, a_{1n}), (a_{21}, a_{22}, …, a_{2n}), …, (a_{n1}, a_{n2}, …, a_{nn}) are linearly independent if and only if the determinant of the n × n matrix [a_{ij}], whose i-th row is the i-th tuple, is non-zero.**

**Example**

The three vectors α = (1, 2, 1), β = (2, 3, 1) and γ = (2, 2, 0) are linearly dependent because the determinant of the matrix with these vectors as rows vanishes:

det = 1(3·0 − 1·2) − 2(2·0 − 1·2) + 1(2·2 − 3·2) = −2 + 4 − 2 = 0.

Hence, by Theorem 04, the given vectors are linearly dependent.
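The determinant test of Theorem 04 can be checked in Python (the helper `det3` is illustrative):

```python
# Cofactor expansion of a 3x3 determinant along the first row.
def det3(m):
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

print(det3([(1, 2, 1), (2, 3, 1), (2, 2, 0)]))  # 0 -- the vectors are dependent
```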
