User:Prof McCarthy/Linear independence

Evaluating linear dependence

Vectors in R2

Three vectors: Consider the set of vectors v1 = (1, 1), v2 = (−3, 2) and v3 = (2, 4); the condition for linear dependence is the existence of scalars a1, a2 and a3, not all zero, such that

\[ a_1 \begin{bmatrix} 1 \\ 1 \end{bmatrix} + a_2 \begin{bmatrix} -3 \\ 2 \end{bmatrix} + a_3 \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \]

or

\[ \begin{bmatrix} 1 & -3 & 2 \\ 1 & 2 & 4 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

Row reduce this matrix equation by subtracting the first row from the second to obtain,

\[ \begin{bmatrix} 1 & -3 & 2 \\ 0 & 5 & 2 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

Continue the row reduction by (i) dividing the second row by 5, and then (ii) multiplying it by 3 and adding it to the first row, that is

\[ \begin{bmatrix} 1 & 0 & 16/5 \\ 0 & 1 & 2/5 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

We can now rearrange this equation to obtain

\[ \begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = -a_3 \begin{bmatrix} 16/5 \\ 2/5 \end{bmatrix}, \]

which shows that non-zero ai exist, so that v3 = (2, 4) can be written as a linear combination of v1 = (1, 1) and v2 = (−3, 2). Thus, the three vectors are linearly dependent.
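As a quick numerical check (a sketch, not part of the derivation), the row reduction gives a1 = −(16/5)a3 and a2 = −(2/5)a3; choosing a3 = 5 yields integer coefficients:

```python
# Verify the dependence relation found above: with a3 = 5,
# the row reduction gives a1 = -(16/5)*5 = -16 and a2 = -(2/5)*5 = -2.
v1, v2, v3 = (1, 1), (-3, 2), (2, 4)
a1, a2, a3 = -16, -2, 5

# Form the linear combination a1*v1 + a2*v2 + a3*v3 componentwise.
combo = tuple(a1 * x + a2 * y + a3 * z for x, y, z in zip(v1, v2, v3))
print(combo)  # -> (0, 0), so the three vectors are linearly dependent
```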

Two vectors: Now consider the linear dependence of the two vectors v1= (1, 1), v2= (-3, 2), and check,

\[ a_1 \begin{bmatrix} 1 \\ 1 \end{bmatrix} + a_2 \begin{bmatrix} -3 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \]

or

\[ \begin{bmatrix} 1 & -3 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}. \]

The same row reduction presented above yields

\[ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \]

which shows that a1 = a2 = 0 is the only solution, so v1 = (1, 1) and v2 = (−3, 2) are linearly independent.
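The same elimination can be replayed mechanically. Below is a minimal sketch, using exact `Fraction` arithmetic, of the three row operations described in the text applied to the 2×2 coefficient matrix:

```python
from fractions import Fraction

# Row-reduce the homogeneous system [[1, -3], [1, 2]] a = 0 exactly as in
# the text: R2 <- R2 - R1, then R2 <- R2 / 5, then R1 <- R1 + 3*R2.
A = [[Fraction(1), Fraction(-3)],
     [Fraction(1), Fraction(2)]]

A[1] = [A[1][j] - A[0][j] for j in range(2)]       # R2 <- R2 - R1   -> [0, 5]
A[1] = [x / Fraction(5) for x in A[1]]             # R2 <- R2 / 5    -> [0, 1]
A[0] = [A[0][j] + 3 * A[1][j] for j in range(2)]   # R1 <- R1 + 3*R2 -> [1, 0]

print(A == [[1, 0], [0, 1]])  # -> True: A reduces to the identity,
                              # so a1 = a2 = 0 is the only solution
```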

Alternative method using determinants

An alternative method uses the fact that n vectors in Rn are linearly independent if and only if the determinant of the matrix formed by taking the vectors as its columns is non-zero.

In this case, the matrix formed by the vectors is

\[ A = \begin{bmatrix} 1 & -3 \\ 1 & 2 \end{bmatrix}. \]

We may write a linear combination of the columns as

\[ A \Lambda = \begin{bmatrix} 1 & -3 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} \lambda_1 \\ \lambda_2 \end{bmatrix}. \]

We are interested in whether AΛ = 0 for some nonzero vector Λ. This depends on the determinant of A, which is

\[ \det A = (1)(2) - (1)(-3) = 5 \neq 0. \]

Since the determinant is non-zero, the vectors (1, 1) and (−3, 2) are linearly independent.
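A one-line sketch of this 2×2 determinant test:

```python
# Determinant test for the columns (1, 1) and (-3, 2) of A:
# for a 2x2 matrix, det = a11*a22 - a21*a12.
c1, c2 = (1, 1), (-3, 2)            # the two columns of A
det = c1[0] * c2[1] - c1[1] * c2[0]  # 1*2 - 1*(-3)
print(det)  # -> 5, non-zero, so the two vectors are linearly independent
```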

Otherwise, suppose we have m vectors of n coordinates, with m < n. Then A is an n×m matrix and Λ is a column vector with m entries, and we are again interested in AΛ = 0. As we saw previously, this is equivalent to a list of n equations. Consider the first m rows of A, that is, the first m equations; any solution of the full list of equations must also satisfy the reduced list. In fact, if ⟨i1,...,im⟩ is any list of m rows, then the equation must hold for those rows.

\[ A_{\langle i_1, \dots, i_m \rangle} \Lambda = \mathbf{0}. \]

Furthermore, the reverse is true. That is, we can test whether the m vectors are linearly dependent by testing whether

\[ \det A_{\langle i_1, \dots, i_m \rangle} = 0 \]

for all possible lists of m rows. (In case m = n, this requires only one determinant, as above. If m > n, then it is a theorem that the vectors must be linearly dependent.) This fact is valuable for theory; in practical calculations more efficient methods are available.
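This all-subdeterminants test can be sketched directly, here in pure Python with exact integer arithmetic (a naive Laplace-expansion determinant is fine for the small matrices involved):

```python
from itertools import combinations

def det(M):
    # Determinant by Laplace expansion along the first row
    # (exponential time, but fine for small matrices).
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def dependent(vectors):
    # m vectors of n coordinates (m <= n), taken as the columns of an
    # n x m matrix A; they are dependent iff the m x m determinant built
    # from rows <i1,...,im> vanishes for every choice of m rows.
    n, m = len(vectors[0]), len(vectors)
    A = [[v[i] for v in vectors] for i in range(n)]
    return all(det([A[i] for i in rows]) == 0
               for rows in combinations(range(n), m))

print(dependent([(1, 1), (-3, 2)]))  # -> False: independent, as shown above
print(dependent([(1, 1), (2, 2)]))   # -> True: the second is twice the first
```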

Example II

Let V = Rn and consider the following elements in V:

\[ e_1 = (1, 0, 0, \ldots, 0), \quad e_2 = (0, 1, 0, \ldots, 0), \quad \ldots, \quad e_n = (0, 0, 0, \ldots, 1). \]

Then e1, e2, ..., en are linearly independent.

Proof

Suppose that a1, a2, ..., an are elements of R such that

\[ a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = 0. \]

Since

\[ a_1 e_1 + a_2 e_2 + \cdots + a_n e_n = (a_1, a_2, \ldots, a_n), \]

then ai = 0 for all i in {1, ..., n}.
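The key step of the proof is that a combination of the standard basis vectors is simply the tuple of its own coefficients. A small sketch for n = 4 (the coefficient values are an arbitrary illustration):

```python
# Standard basis of R^n for n = 4: e_i has a 1 in position i, 0 elsewhere.
n = 4
e = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

a = [3, -1, 0, 7]  # arbitrary example coefficients
# The combination a_1*e_1 + ... + a_n*e_n, computed componentwise.
combo = [sum(a[i] * e[i][j] for i in range(n)) for j in range(n)]
print(combo == a)  # -> True: the combination equals (a_1, ..., a_n),
                   # so it is zero only when every a_i is zero
```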

Example III

Let V be the vector space of all functions of a real variable t. Then the functions \(e^t\) and \(e^{2t}\) in V are linearly independent.

Proof

Suppose a and b are two real numbers such that

\[ a e^t + b e^{2t} = 0 \]

for all values of t. We need to show that a = 0 and b = 0. In order to do this, we divide through by \(e^t\) (which is never zero) and subtract to obtain

\[ b e^t = -a. \]

In other words, the function \(b e^t\) must be independent of t, which only occurs when b = 0. It follows that a is also zero.
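A numerical illustration (not a substitute for the proof): if the relation held for all t, it would in particular hold at t = 0 and t = 1, giving a 2×2 linear system in a and b whose determinant is non-zero, forcing a = b = 0.

```python
import math

# Evaluate a*e^t + b*e^(2t) = 0 at t = 0 and t = 1:
#   t = 0:  a*1 + b*1   = 0
#   t = 1:  a*e + b*e^2 = 0
M = [[math.exp(0), math.exp(0)],
     [math.exp(1), math.exp(2)]]
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
print(det)  # e^2 - e (about 4.67), non-zero, so a = b = 0 is forced
```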

Example IV

The following vectors in R4 are linearly dependent.

\[ v_1 = \begin{bmatrix} 1 \\ 4 \\ 2 \\ -3 \end{bmatrix}, \quad v_2 = \begin{bmatrix} 7 \\ 10 \\ -4 \\ -1 \end{bmatrix}, \quad v_3 = \begin{bmatrix} -2 \\ 1 \\ 5 \\ -4 \end{bmatrix}. \]

Proof

We need to find scalars \(\lambda_1\), \(\lambda_2\) and \(\lambda_3\), not all zero, such that

\[ \lambda_1 v_1 + \lambda_2 v_2 + \lambda_3 v_3 = 0. \]

Forming the simultaneous equations:

\[ \begin{aligned} \lambda_1 + 7\lambda_2 - 2\lambda_3 &= 0 \\ 4\lambda_1 + 10\lambda_2 + \lambda_3 &= 0 \\ 2\lambda_1 - 4\lambda_2 + 5\lambda_3 &= 0 \\ -3\lambda_1 - \lambda_2 - 4\lambda_3 &= 0 \end{aligned} \]

we can solve (using, for example, Gaussian elimination) to obtain:

\[ \lambda_1 = -\tfrac{3}{2}\lambda_3, \qquad \lambda_2 = \tfrac{1}{2}\lambda_3, \]

where \(\lambda_3\) can be chosen arbitrarily.

Since nontrivial solutions exist, the vectors are linearly dependent.
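As a check (a sketch using exact arithmetic), choosing λ3 = 2 in the solution above gives the integer coefficients (λ1, λ2, λ3) = (−3, 1, 2), and the corresponding combination of the vectors vanishes:

```python
from fractions import Fraction as F

# With lambda1 = -(3/2)*lambda3 and lambda2 = (1/2)*lambda3,
# take lambda3 = 2 so (l1, l2, l3) = (-3, 1, 2).
v1, v2, v3 = (1, 4, 2, -3), (7, 10, -4, -1), (-2, 1, 5, -4)
l1, l2, l3 = F(-3), F(1), F(2)

# Form l1*v1 + l2*v2 + l3*v3 componentwise.
combo = tuple(l1 * a + l2 * b + l3 * c for a, b, c in zip(v1, v2, v3))
print(combo)  # -> (0, 0, 0, 0): a nontrivial relation, so dependent
```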