



Brief lecture notes on Linear Algebra, covering topics such as subspaces, basis, dimension, rank, and the relationship between row and column spaces of a matrix. It includes definitions, theorems, and methods for finding a basis of row(A), col(A), and null(A).
Typology: Lecture notes
Definition. A subspace of R^n is any collection S of vectors in R^n such that
1. S contains the zero vector ~0;
2. S is closed under addition: if ~u and ~v are in S, then ~u + ~v is in S;
3. S is closed under scalar multiplication: if ~v is in S and c is a scalar, then c~v is in S.
Remark. Property 1 is needed only to ensure that S is non-empty; for non-empty S, property 1 follows from property 3, since 0·~a = ~0.
Theorem 3.19. Let ~v1, ~v2, ..., ~vk be vectors in R^n. Then span(~v1, ~v2, ..., ~vk) is a subspace of R^n.
Definition. Let A be an m × n matrix. The row space of A, denoted row(A), is the subspace of R^n spanned by the rows of A. The column space of A, denoted col(A), is the subspace of R^m spanned by the columns of A.
If we need to determine whether ~b belongs to col(A), this is the same problem as deciding whether ~b is in the span of the columns of A; see the method on p. 9.
If we need to determine whether ~b belongs to row(A), we can apply the same method as above to ~b^T and col(A^T). Another method for the same task is described in Example 3.41 in the textbook.
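The membership test above can be sketched with the SymPy library (the matrix and vectors here are hypothetical examples, not taken from the notes): ~b belongs to col(A) exactly when the system A~x = ~b is consistent, i.e. when appending ~b as an extra column does not raise the rank.

```python
from sympy import Matrix

# Hypothetical matrix and vectors, chosen for illustration only.
A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])
b = Matrix([2, 3, 5])   # equals 2*(column 1) + 3*(column 2)
c = Matrix([1, 0, 0])

def in_col_space(A, b):
    # b is in col(A) iff A x = b is consistent,
    # i.e. iff appending b as a column does not increase the rank.
    return A.rank() == A.row_join(b).rank()

print(in_col_space(A, b))  # True
print(in_col_space(A, c))  # False
```

For row(A), the same function can be applied to A.T with ~b written as a column vector.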
Theorem 3.20. Let B be any matrix that is row equivalent to a matrix A. Then row(B) = row(A).
See the theorem on p. 13.
Theorem 3.21. Let A be an m × n matrix and let N be the set of solutions of the homogeneous linear system A~x = ~0. Then N is a subspace of R^n.
Definition. Let A be an m × n matrix. The null space of A is the subspace of R^n consisting of all solutions of the homogeneous linear system A~x = ~0. It is denoted by null(A).
Theorem. Let B be any matrix that is row equivalent to a matrix A. Then null(B) = null(A).
This is the Fundamental Theorem on elementary row operations; see p. 4.
E.g., the set {[x1, x2, x3] | x1 + x2 + x3 = 0} is automatically a subspace of R^3: there is no need to verify the closedness properties 1, 2, 3, since it is the null space of the homogeneous system consisting of the single equation x1 + x2 + x3 = 0.
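This example can also be checked computationally: SymPy's nullspace method returns a basis of null(A), so for the 1 × 3 matrix of the single equation above it produces a basis of this plane. (A minimal sketch; the matrix comes from the example, the code itself is not part of the notes.)

```python
from sympy import Matrix

# null(A) for the single homogeneous equation x1 + x2 + x3 = 0
A = Matrix([[1, 1, 1]])
basis = A.nullspace()        # list of basis vectors of the null space
print(len(basis))            # 2: the solution set is a plane in R^3
for v in basis:
    assert sum(v) == 0       # every basis vector satisfies the equation
```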
Definition. A basis for a subspace S of R^n is a set of vectors in S that
1. spans S, and
2. is linearly independent.
Remark. It can be shown that this definition is equivalent to each of the following two definitions:
Definition′. A basis for a subspace S of R^n is a set of vectors in S that spans S and is minimal with this property (that is, no proper subset spans S).
Definition′′. A basis for a subspace S of R^n is a set of vectors in S that is linearly independent and is maximal with this property (that is, adding any other vector of S to this set makes the resulting set linearly dependent).
Method for finding a basis of row(A). Reduce A to its reduced row echelon form R by elementary row operations. (We know row(A) = row(R).) The non-zero rows of R, say ~b1, ..., ~br, form a basis of row(R) = row(A). Indeed, they clearly span row(R), as zero rows contribute nothing. The linear independence of the non-zero rows can be seen from the columns with leading 1s: in a linear combination ∑ ci~bi, the coordinate in the column of the 1st leading 1 is c1, since there are only zeros above and below this leading 1; likewise, the coordinate in the column of the 2nd leading 1 is c2; and so on. Hence if ∑ ci~bi = ~0, then we must have all ci = 0. Moreover, the same is true for any row echelon form Q (not necessarily reduced): the non-zero rows of Q form a basis of row(Q) = row(A).
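The method above can be sketched in SymPy, whose rref method returns the reduced row echelon form together with the pivot columns (the matrix here is a hypothetical example):

```python
from sympy import Matrix

# Hypothetical matrix with dependent rows (row 2 = 2 * row 1).
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])
R, pivot_cols = A.rref()     # reduced row echelon form and pivot columns
# The non-zero rows of R form a basis of row(R) = row(A).
basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]
print(len(basis))            # 2 = rank(A)
```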
1st Method for finding a basis of col(A). Apply the previous method to A^T.
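A sketch of this method in SymPy (again with a hypothetical matrix): row-reduce A^T, then transpose the non-zero rows back into column vectors of col(A).

```python
from sympy import Matrix

# Hypothetical 3x2 matrix with independent columns.
A = Matrix([[1, 1],
            [2, 3],
            [3, 4]])
Rt, _ = A.T.rref()           # reduce the transpose: row(A^T) corresponds to col(A)
# Non-zero rows of Rt, transposed back, form a basis of col(A).
basis = [Rt.row(i).T for i in range(Rt.rows) if any(Rt.row(i))]
print(len(basis))            # 2 = rank(A)
```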
For the examination, no proof is needed. But, for completeness of exposition, I give a proof of the existence of a basis, Theorem 3.23+(a), here.
The existence of a basis. Let S be a non-zero subspace of R^n (that is, S does not consist of the zero vector only). Then S has a basis.
Proof. Consider all linearly independent systems of vectors in S. Since S contains a non-zero vector ~v ≠ ~0, there is at least one such system: ~v. Now, if ~v1, ..., ~vk is a system of linearly independent vectors in S, we have k ≤ n by Theorem 2.8.
We come to the crucial step of the proof: choose a system of linearly independent vectors ~v1, ..., ~vk in S with k as large as possible, and consider U = span(~v1, ..., ~vk).
Observe that U ⊆ S. If U = S, then ~v1, ..., ~vk is a basis of S by the definition of a basis, and our theorem is proven. Therefore we may assume that U ≠ S and choose a vector ~v ∈ S \ U (in S but not in U).
The rest of the proof of Theorem 3.23 can be taken from the textbook.
Definition. If S is a subspace of R^n, then the number of vectors in a basis for S is called the dimension of S, denoted dim S.
Remark. The zero vector ~0 by itself is always a subspace of R^n. (Why?) Yet any set containing the zero vector (and, in particular, {~0}) is linearly dependent, so {~0} cannot have a basis. We define dim{~0} to be 0.
Examples. 1) As we know, the n standard unit vectors form a basis of R^n; thus, dim R^n = n.
We shall need a slightly more general result:
Theorem 3.23++. (a) If ~v1, ..., ~vk are linearly independent vectors in a subspace S, then they can be extended (complemented) to a basis of S; in particular, k ≤ dim S.
(b) If one subspace is contained in another, S ⊆ T , then dim S ≤ dim T. If both S ⊆ T and dim S = dim T , then S = T.
Example. If we have n linearly independent vectors ~v1, ..., ~vn in R^n, they must also form a basis of R^n: the dimension of their span is n, and we can apply Theorem 3.23++(b).
Theorem 3.24. The row and column spaces of a matrix A have the same dimension.
Definition. The rank of a matrix A is the dimension of its row and column spaces; it is denoted by rank(A).
Theorem 3.25. For any matrix A,
rank(A^T) = rank(A).
Definition. The nullity of a matrix A is the dimension of its null space; it is denoted by nullity(A).
Theorem 3.26. The Rank–Nullity Theorem
If A is an m × n matrix, then
rank(A) + nullity(A) = n.
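The Rank–Nullity Theorem is easy to verify computationally; a sketch with SymPy and a hypothetical 3 × 4 matrix:

```python
from sympy import Matrix

# Hypothetical 3x4 matrix (row 3 = row 1 + row 2, so rank < 3).
A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])
rank = A.rank()
nullity = len(A.nullspace())     # dimension of null(A)
print(rank, nullity)             # → 2 2
assert rank + nullity == A.cols  # rank(A) + nullity(A) = n
```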
Theorem 3.27. The Fundamental Theorem of Invertible Matrices
Let A be an n × n matrix. The following statements are equivalent:
a. A is invertible.
b. A~x = ~b has a unique solution for every ~b in R^n.
c. A~x = ~0 has only the trivial solution.
d. The reduced row echelon form of A is I_n.
e. A is a product of elementary matrices.
f. rank(A) = n.
g. nullity(A) = 0.
h. The column vectors of A are linearly independent.
i. The column vectors of A span R^n.
j. The column vectors of A form a basis for R^n.
k. The row vectors of A are linearly independent.
l. The row vectors of A span R^n.
m. The row vectors of A form a basis for R^n.
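Several of these equivalent statements can be checked mechanically for a concrete matrix; a sketch with SymPy and a hypothetical invertible 3 × 3 matrix:

```python
from sympy import Matrix, eye

# Hypothetical invertible matrix.
A = Matrix([[2, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])
assert A.det() != 0             # (a) A is invertible
assert A.rref()[0] == eye(3)    # (d) the rref of A is I_n
assert A.rank() == 3            # (f) rank(A) = n
assert A.nullspace() == []      # (g) nullity(A) = 0
print("all checked conditions hold")
```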