Shared Flashcard Set

Details

Title: Linear Algebra
Description: Theorems needed for Exam 2
Total Cards: 16
Subject: Mathematics
Level: Undergraduate 3
Created: 04/25/2009

Cards

Term
Theorem 2.12
Definition
The null space of an m x n matrix A is a subspace of R^n. Equivalently, the set of all solutions to a system Ax = 0 of m homogeneous linear equations in n unknowns is a subspace of R^n.
Term
Theorem 2.13
Definition
The pivot columns of a matrix A form a basis for the column space of A.
Term
Theorem 2.14 (Rank Theorem)
Definition
If a matrix A has n columns, then rank A + dim Nul A = n.
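A quick numerical sketch of the Rank Theorem with NumPy, on a hypothetical 3 x 4 matrix (my example, not part of the original card):

import numpy as np

# Hypothetical 3 x 4 matrix whose third row is the sum of the first two,
# so rank A = 2 and the Rank Theorem gives dim Nul A = 4 - 2 = 2.
A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 0],
              [1, 3, 1, 1]])
rank = np.linalg.matrix_rank(A)
dim_nul = A.shape[1] - rank        # n - rank A, by the Rank Theorem
print(rank, dim_nul)               # 2 2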
Term
Theorem 2.15 (The Basis Theorem)
Definition
Let H be a p-dimensional subspace of R^n. Any linearly independent set of exactly p elements in H is automatically a basis for H. Also, any set of p elements of H that spans H is automatically a basis for H.
Term
Theorem 5.1
Definition
The eigenvalues of a triangular matrix are the entries on its main diagonal.
Term
Theorem 5.2
Definition
If v1,...,vr are eigenvectors that correspond to distinct eigenvalues λ1,...,λr of an n x n matrix A, then the set {v1,...,vr} is linearly independent.
Term
Theorem 5.4
Definition
If n x n matrices A and B are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities).
Term
Theorem 5.5 (The Diagonalization Theorem)
Definition
An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^(-1), with D a diagonal matrix, if and only if the columns of P are n linearly independent eigenvectors of A. In this case, the diagonal entries of D are the eigenvalues of A that correspond, respectively, to the eigenvectors in P.
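A small NumPy check of the factorization A = PDP^(-1), using a hypothetical 2 x 2 matrix with two distinct eigenvalues (my example, not from the card set):

import numpy as np

# Hypothetical matrix with eigenvalues 5 and 2, hence diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, P = np.linalg.eig(A)            # columns of P are eigenvectors of A
D = np.diag(eigvals)                     # diagonal entries are the eigenvalues
print(np.allclose(P @ D @ np.linalg.inv(P), A))   # True: A = P D P^(-1)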
Term
Theorem 5.6
Definition
An n x n matrix with n distinct eigenvalues is diagonalizable.
Term
Theorem 5.9
Definition
Let A be a real 2 x 2 matrix with a complex eigenvalue λ = a - bi (b ≠ 0) and an associated eigenvector v in C^2. Then A = PCP^(-1), where P = [Re v  Im v] and C = [a -b; b a] (the 2 x 2 matrix with rows (a, -b) and (b, a)).
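A NumPy sketch of this factorization, using a hypothetical matrix with eigenvalues 2 ± i (my example):

import numpy as np

# Hypothetical real 2 x 2 matrix with complex eigenvalues 2 + i and 2 - i.
A = np.array([[1.0, -2.0],
              [1.0,  3.0]])
eigvals, eigvecs = np.linalg.eig(A)

# Take lambda = a - bi (the eigenvalue with negative imaginary part) and its eigenvector v.
k = np.argmin(eigvals.imag)
lam, v = eigvals[k], eigvecs[:, k]
a, b = lam.real, -lam.imag               # lambda = a - bi

P = np.column_stack([v.real, v.imag])    # P = [Re v  Im v]
C = np.array([[a, -b],
              [b,  a]])
print(np.allclose(P @ C @ np.linalg.inv(P), A))   # True: A = P C P^(-1)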
Term
Theorem 6.2 (Pythagorean Theorem)
Definition
Two vectors u and v are orthogonal if and only if ||u + v||^2 = ||u||^2 + ||v||^2.
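A one-line numerical check with NumPy, using a hypothetical orthogonal pair in R^3:

import numpy as np

# Hypothetical orthogonal vectors: u . v = 2 + 2 - 4 = 0.
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, -2.0])
lhs = np.linalg.norm(u + v)**2                        # ||u + v||^2 = 18
rhs = np.linalg.norm(u)**2 + np.linalg.norm(v)**2     # 9 + 9 = 18
print(np.isclose(lhs, rhs))                           # True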
Term
Theorem 6.3
Definition
Let A be an m x n matrix. The orthogonal complement of the row space of A is the null space of A, and the orthogonal complement of the column space of A is the null space of A^T: (Row A)^(perp) = Nul A and (Col A)^(perp) = Nul A^T.
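A NumPy sketch of (Row A)^(perp) = Nul A on a hypothetical rank-2 matrix, with the null-space basis taken from the SVD:

import numpy as np

# Hypothetical 3 x 4 matrix of rank 2 (third row = first row + second row).
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

# Basis for Nul A: right singular vectors whose singular values are (near) zero.
_, s, Vt = np.linalg.svd(A)
N = Vt[np.sum(s > 1e-10):].T             # 4 x 2 matrix whose columns span Nul A
print(np.allclose(A @ N, 0))             # True: every row of A is orthogonal to Nul A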
Term
Theorem 6.4
Definition
If S = {u1,...,up} is an orthogonal set of nonzero vectors in R^n, then S is linearly independent and hence is a basis for the subspace spanned by S.
Term
Theorem 6.5
Definition
Let {u1,...,up} be an orthogonal basis for a subspace W of R^n. For each y in W, the weights in the linear combination y = c1u1+...+cpup are given by cj = (y dot uj)/(uj dot uj) (j = 1,...,p).
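A NumPy check of the weight formula cj = (y dot uj)/(uj dot uj), with a hypothetical orthogonal basis for a plane W in R^3:

import numpy as np

# Hypothetical orthogonal basis {u1, u2} for W, and y = 3*u1 + 2*u2 in W.
u1 = np.array([1.0,  1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
y = 3*u1 + 2*u2                          # y = (5, 1, 0)

c1 = np.dot(y, u1) / np.dot(u1, u1)      # 6/2 = 3.0
c2 = np.dot(y, u2) / np.dot(u2, u2)      # 4/2 = 2.0
print(c1, c2)                            # recovers the weights 3.0 2.0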
Term
Theorem 6.6
Definition
An m x n matrix U has orthonormal columns if and only if (U^T)U = I, the n x n identity matrix.
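A quick NumPy check on a hypothetical 3 x 2 matrix with orthonormal columns:

import numpy as np

# Hypothetical 3 x 2 matrix U whose columns are orthonormal.
U = np.array([[1/np.sqrt(2), 0.0],
              [1/np.sqrt(2), 0.0],
              [0.0,          1.0]])
print(np.allclose(U.T @ U, np.eye(2)))   # True: (U^T)U = I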
Term
Theorem 6.7
Definition
Let U be an m x n matrix with orthonormal columns, and let x and y be in R^n. Then:
(a) ||Ux|| = ||x||
(b) (Ux) dot (Uy) = x dot y
(c) (Ux) dot (Uy) = 0 if and only if x dot y = 0.
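Parts (a) and (b) can be checked numerically with hypothetical U, x, and y (part (c) follows from (b)):

import numpy as np

# Hypothetical 3 x 2 matrix U with orthonormal columns, and x, y in R^2.
U = np.array([[1/np.sqrt(2), 0.0],
              [1/np.sqrt(2), 0.0],
              [0.0,          1.0]])
x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])

print(np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x)))   # (a) ||Ux|| = ||x||
print(np.isclose(np.dot(U @ x, U @ y), np.dot(x, y)))         # (b) (Ux) dot (Uy) = x dot y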