Shared Flashcard Set

Details

Linear Algebra - Test 2
Terms for the second third of an introductory linear algebra class
32
Mathematics
Undergraduate 1
04/07/2014

Cards

Term
matrix
Definition

a rectangular array of numbers called entries / elements

m rows x n columns

indexed a_ij (entry in row i, col j)

Term
square matrix
Definition

m = n

(# rows = # cols)

Term

diagonal matrix
Definition
all nondiagonal entries are 0
Term
identity matrix
Definition
diagonal matrix where all diagonal entries are 1
Term
matrices A and B are equal if and only if...
Definition
they have the same size and the same entry at each index
Term
adding matrices A, B
Definition

A and B must be same size

 

A + B = [aij + bij]

Term
scalar multiplication of a matrix
Definition
c * A = [c * aij]
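A quick numpy sketch (the library and the example matrices are illustrative assumptions, not part of the card) showing that addition and scalar multiplication act entrywise:

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A + B)   # entrywise sum: [[ 6  8] [10 12]]
print(3 * A)   # every entry scaled by 3: [[ 3  6] [ 9 12]]
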
Term
zero matrix
Definition
matrix with all elements = 0
Term
matrix multiplication
Definition

# cols of matrix A must equal # rows of matrix B

(m x n) * (n x r) -> produces m x r matrix

 

A * B = [ai · bj], where ai = row i of A and bj = col j of B (each entry is a dot product)
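A small numpy sketch (library and matrices chosen for illustration) of the size rule and the dot-product description of each entry:

import numpy as np

A = np.random.rand(2, 3)   # m x n
B = np.random.rand(3, 4)   # n x r
C = A @ B                  # m x r
print(C.shape)             # (2, 4)

# entry (i, j) is the dot product of row i of A with column j of B
i, j = 1, 2
print(np.isclose(C[i, j], A[i, :] @ B[:, j]))   # True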

Term
properties of matrix addition / multiplication
Definition

AB != BA in general (matrix multiplication is not commutative)

 

A(BC) = (AB)C

A(kB) = k(AB)

A(B+C) = AB + AC

(A+B)C = AC + BC

 

IA = AI = A

 

A^3 = A * A * A
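These properties can be spot-checked numerically; a sketch with numpy and arbitrarily chosen matrices (both assumptions for illustration):

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[0, 1], [1, 0]])
C = np.array([[2, 0], [1, 1]])

print(np.array_equal(A @ B, B @ A))                           # False: AB != BA in general
print(np.allclose(A @ (B @ C), (A @ B) @ C))                  # True: associativity
print(np.allclose(A @ (B + C), A @ B + A @ C))                # True: left distributivity
print(np.allclose(np.linalg.matrix_power(A, 3), A @ A @ A))   # True: A^3 = A * A * A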

Term
transpose of a matrix A
Definition

rowi(A^T) = coli(A)

 

flip A so that the nth row of A becomes the nth col of A^T

Term
a matrix is symmetric iff
Definition

A = A^T

 

[aij] = [aji]

 

think reflecting across the diagonal

Term
properties of transposing a matrix
Definition

(A^T)^T = A

(kA)^T = k(A^T)

(A^r)^T = (A^T)^r (r > 0)

(A + B)^T = A^T + B^T

(AB)^T = (B^T)(A^T)
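A numpy spot-check of these transpose rules (the matrices are arbitrary examples, not from the card):

import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6]])
B = np.array([[1, 0], [0, 2], [3, 1]])

print(np.array_equal(A.T.T, A))               # (A^T)^T = A
print(np.array_equal((5 * A).T, 5 * A.T))     # (kA)^T = k(A^T)
print(np.array_equal((A + A).T, A.T + A.T))   # (A+B)^T = A^T + B^T (here with B = A)
print(np.array_equal((A @ B).T, B.T @ A.T))   # (AB)^T = (B^T)(A^T)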

 

Term
the inverse of a matrix A
Definition

A is an nxn (square) matrix

 

A^-1 is an nxn matrix such that

 

A(A^-1) = (A^-1)A = I

 

if A is invertible, then A^-1 is unique
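A minimal numpy sketch (the 2x2 example is an assumption) of computing an inverse and verifying the defining property:

import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])     # det = 1, so A is invertible
A_inv = np.linalg.inv(A)

print(np.allclose(A @ A_inv, np.eye(2)))   # A(A^-1) = I
print(np.allclose(A_inv @ A, np.eye(2)))   # (A^-1)A = I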

Term
the null space of A
Definition

null(A) is the set of all vectors x such that Ax = 0

 

(A times the vector x is the zero vector)

 

or, null(A) = {x | Ax = 0}

 

observation:

A is invertible iff null(A) = {0 vector}
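A sketch using scipy (the library and the example matrices are assumptions) to compute a basis of null(A) and to see the observation in action:

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # second row is a multiple of the first
N = null_space(A)                        # columns of N form a basis of null(A)
print(N.shape)                           # (2, 1): a 1-dimensional null space
print(np.allclose(A @ N, 0))             # True: Ax = 0 for every basis vector x

I2 = np.eye(2)                           # invertible
print(null_space(I2).shape)              # (2, 0): null(I2) = {0 vector}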

Term
basis of a subspace S
Definition

a set of vectors that

1) spans S

2) is linearly independent

 

e.g. a basis for R2 is {[1 0], [0 1]}: two linearly independent vectors whose linear combinations reach all of R2

Term

finding bases for row(A) and col(A)


(the row and col space of matrix A)

Definition

row(A) = span({nonzero rows of r.r.e.f. of A})

 

col(A) = span({the columns of *A* whose positions have leading 1s in the r.r.e.f. of A})

 

WARNING: col(A) != col(r.r.e.f.(A)) (generally)
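A sketch of this recipe using sympy (library and example matrix chosen for illustration); rref() returns the r.r.e.f. together with the pivot column indices:

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

R, pivot_cols = A.rref()
print(R)                      # the nonzero rows of R span row(A)
print(pivot_cols)             # (0, 1): columns 0 and 1 of *A* give a basis of col(A)
print([A.col(j) for j in pivot_cols])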

Term
is a 2x2 matrix [[a b][c d]] invertible?
Definition
iff ad - bc != 0 (the determinant is nonzero)
Term
elementary matrix
Definition

any matrix obtained by performing one elementary row operation on I

 

1) swap two rows

2) multiply a row by a scalar

3) add a multiple of one row to another
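One way to see this concretely (numpy and the chosen operations are illustrative assumptions): apply a single row operation to I, then note that left-multiplying by the result performs that same operation:

import numpy as np

I = np.eye(3)

E_swap = I[[1, 0, 2], :]      # 1) swap rows 0 and 1
E_scale = I.copy()
E_scale[2, 2] = 5.0           # 2) multiply row 2 by 5
E_add = I.copy()
E_add[1, 0] = -2.0            # 3) add -2 * row 0 to row 1

A = np.arange(9.0).reshape(3, 3)
print(E_swap @ A)             # same as swapping rows 0 and 1 of A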

Term
fundamental theorem of invertible matrices
Definition

TFAE

 

a) A is invertible

 

b) Ax = b has the unique solution x = (A^-1)(b) for all b in R^n

 

c) Ax = 0 has only the trivial solution x = 0

 

d) the system [A | 0] has only the trivial solution (no free variables); row reducing gives [A | 0] -> [I | 0]

 

e) there are elementary row operations R1 ... Rk that transform A into I

let Ei = the elementary matrix of Ri

Ek ... E3E2E1A = I

A = (E1^-1)(E2^-1)...(Ek^-1), thus

 

f) A is a product of elementary matrices; each Ei^-1 is invertible, and ((E1^-1)...(Ek^-1))^-1 = Ek ... E2E1

g) rank(A) = n

h) nullity(A) = 0

i) columns of A are linearly independent

j) columns span R^n

k) columns are a basis of R^n

l) rows of A are linearly independent

m) rows span R^n

n) rows are a basis of R^n
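A numeric spot-check of a few of these equivalences for one invertible matrix (numpy and the example are assumptions):

import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 1, so A is invertible
n = A.shape[0]

print(np.linalg.matrix_rank(A) == n)          # rank(A) = n
b = np.array([3.0, 2.0])
x = np.linalg.solve(A, b)                     # the unique solution of Ax = b
print(np.allclose(A @ x, b))                  # True
print(np.allclose(x, np.linalg.inv(A) @ b))   # x = (A^-1)(b)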



Term
a subspace in Rn is a set of vectors S such that
Definition

1) 0 vector is in S

2) if u, v in S, then u + v in S (closed under addition)

3) if u in S, c in R, then cu in S (closed under scalar multiplication)

 

thus, a subspace S is a set of vectors such that you can't use linear combinations to get out

 

equivalent: if u, v in S and c, d in R then cu + dv in S (closed under linear combinations)

Term
trivial subspace of Rn
Definition
{0 vector}
Term
row space and col space of A
Definition

row(A) = space spanned by the rows of A - a subspace of R^n

col(A) = space spanned by the cols of A - a subspace of R^m

Term
the dimension of S
Definition

dim(S) is the number of vectors in a basis of S

 

therefore, dim(col(A)) == dim(row(A))

 

because dim(row(A)) = # nonzero rows in rref(A)

                    = # of leading 1s in rref(A)

                    = dim(col(A))

Term
the rank of a matrix
Definition

rank(A) = dim(row(A))

        = dim(col(A))

 

easy consequence:

rank(A) = rank(A^T)
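A quick numpy check (example matrix assumed) that the two dimensions agree and that rank(A) = rank(A^T):

import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 0, 1]])

print(np.linalg.matrix_rank(A))     # 2: dim(row(A)) = dim(col(A)) = 2
print(np.linalg.matrix_rank(A.T))   # 2: rank(A) = rank(A^T)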

Term
the nullity of A
Definition

nullity(A) = dim(null(A))

 

observe:

nullity(A) = # free variables in rref(A) aug. with 0 vector

 

each free variable of the rref gives one basis vector of the null space
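A sketch (numpy/scipy assumed) illustrating nullity, plus the standard consequence of this card and the rank card: # leading 1s + # free variables = n, i.e. rank(A) + nullity(A) = n:

import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
n = A.shape[1]

rank = np.linalg.matrix_rank(A)
nullity = null_space(A).shape[1]            # dim(null(A))
print(rank, nullity, rank + nullity == n)   # 2 1 True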

Term

S is a subspace of Rn with basis B = {v1 ... vk}

For all vectors in S, there is a unique linear combination

c1v1 + c2v2 + ... + ckvk = v

 

these c's are referred to as what?

Definition

the coordinates of v with respect to B

 

[c1 ... ck] = coordinate vector of v with respect to B
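A numpy sketch (the basis vectors v1, v2 and the vector v are assumed examples) of solving for the coordinates of v with respect to B:

import numpy as np

v1 = np.array([1.0, 1.0])    # basis B = {v1, v2} of R^2
v2 = np.array([1.0, -1.0])
v  = np.array([3.0, 1.0])

# solve c1*v1 + c2*v2 = v for the coordinate vector [c1 c2]
B = np.column_stack([v1, v2])
c = np.linalg.solve(B, v)
print(c)                                    # [2. 1.], since 2*v1 + 1*v2 = v
print(np.allclose(c[0]*v1 + c[1]*v2, v))    # True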

Term
for an nxn matrix A, the determinant of A = ?
Definition

det(A) = a11det(A11) - a12det(A12) + ... + (-1)^(n+1) a1n det(A1n)

 

A11 = matrix A with row 1 col 1 "struck" out

 

summation (i = 1 to n) a1i * C1i

 

^ the cofactor expansion
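An intentionally naive Python version of this cofactor expansion along row 1 (numpy and the test matrix are assumptions; in practice np.linalg.det is far faster):

import numpy as np

def det_cofactor(A):
    # determinant by cofactor expansion along the first row
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)   # row 1 and col j+1 "struck" out
        total += (-1) ** j * A[0, j] * det_cofactor(minor)      # a_1j * C_1j
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(det_cofactor(A), np.linalg.det(A))   # both approximately 8.0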

Term
Laplace expansion theorem
Definition

the determinant of an nxn matrix A = [aij] (n >= 2) can be computed as

 

det(A) = ai1Ci1 + ai2Ci2 + ... + ainCin

(expansion along row i)

 

or

 

det(A) = a1jC1j + a2jC2j + ... + anjCnj

(expansion down column j)

Term
the determinant of an upper/lower triangular matrix is ...
Definition
the product of its diagonal entries
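A quick numpy check (example triangular matrix assumed):

import numpy as np

U = np.array([[2.0, 7.0, 1.0],
              [0.0, 3.0, 5.0],
              [0.0, 0.0, 4.0]])     # upper triangular

print(np.prod(np.diag(U)))          # 24.0: product of the diagonal entries
print(np.linalg.det(U))             # ~24.0
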
Term
theorem of determinants?
Definition

a) if A has an all-0 row or col, det(A) = 0

 

b) if B comes from swapping two rows of A, then the det(B) = -det(A)

 

c) if A has two equal rows, then det(A) = 0

 

d) if B comes from multiplying a row or col by a constant k, then det(B) = k*det(A)

 

e) if A, B, C are identical except that the ith row (or col) of C is the sum of the ith rows (or cols) of A and B, then det(C) = det(A) + det(B)

 

f) if B comes from adding a multiple of row i of A to row j (i != j), then det(B) = det(A)

 

theorem:

if E is an elementary matrix, then det(EB) = det(E)det(B)

 

A is invertible iff det(A) != 0
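A few of these facts spot-checked with numpy (matrices chosen arbitrarily; det(AB) = det(A)det(B) for general square A and B is the standard extension of the elementary-matrix statement above):

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 5.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

A_swapped = A[[1, 0], :]          # b) swap two rows
print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A)))    # True

A_scaled = A.copy()
A_scaled[0, :] *= 4               # d) multiply a row by k = 4
print(np.isclose(np.linalg.det(A_scaled), 4 * np.linalg.det(A)))  # True

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(A), 0.0))   # False, so A is invertible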

 
