## Colorado State University, Pueblo; Fall 2015 Math 307 — Introduction to Linear Algebra Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

This schedule will change very frequently; please check it at least every class day, and before starting work on any assignment (in case the content of the assignment has changed).

In the following all reading assignments, sections, and page numbers refer to the required course textbook, Linear Algebra, A Modern Introduction (3rd edition), by David Poole, unless otherwise specified.

If you see the symbol below, it means that class was videoed and you can get a link by e-mailing me. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day for you.

Homework for a particular day is due that day, either in class or handed in at my office by 3pm.

#### Week 1

• :
• bureaucracy and introductions.
• what is Linear Algebra (the study of vector spaces and linear transformations...)
• why do we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for
• an example of an application of linear algebra: the $419 billion eigenvalue problem, Google's PageRank algorithm
• Read the course syllabus and policy page.
• HW0: Send me e-mail (at jonathan@poritz.net) telling me:
  1. Your name.
  2. Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
  3. Your year/program/major at CSUP.
  4. What you intend to do after CSUP, in so far as you have an idea.
  5. Past math classes you've had.
  6. The reason you are taking this course.
  7. Your favorite mathematical subject.
  8. Your favorite mathematical result/theorem/technique/example/problem.
  9. Anything else you think I should know (disabilities, employment or other things that take a lot of time, etc.).
  10. [Optional:] The name of a good book you have read recently.

  Please do this some time Monday. [By the way, just to be fair, in case you are interested, here is a version of such a self-introductory e-mail with information as I would fill it out for myself.]
• Miniquiz 0 today
• :
• Read: To the Student, pp. xxv & xxvi, and §§1.1 & 1.2
• Content:
  1. logical and basic set-theoretic terminology/notation
  2. some basic sets of numbers:
     1. natural numbers $\NN$
     2. integers $\ZZ$
     3. rationals $\QQ$
     4. real numbers $\RR$
  3. starting good definitional style, including:
     • all variables must be "bound"
     • clearly identify the symbol and/or terminology being defined
     • clearly identify the type of object being defined
  4. quantifiers $\forall$ and $\exists$
  5. vectors in $\RR^n$
  6. vector addition
  7. scalar multiplication
  8. the dot product
  9. norms
  10. some basic properties:
     • of vector arithmetic
     • of dot products and norms
     • the triangle inequality
     • the Cauchy-Schwarz Inequality
• Miniquiz 1 today
• :
• [Re]Read: §§1.1 & 1.2
• Content:
  1. linear combinations of vectors
  2. angles between vectors
  3. orthogonal vectors, the notation $\vec{v}\perp\vec{w}$
  4. unit vectors
  5. distances between vectors
  6. projection of one vector onto another
  7. proof that $\operatorname{proj}_{\vec{u}}(\vec{v})$ is the vector on the line along the vector $\vec{u}$ which is closest to $\vec{v}$
• Miniquiz 2 today
• :
• [Re]Read: §§1.1-1.3 & 2.1
• Content:
  1. going over past miniquizzes and things like the HW problems:
     • always bind variables, and use quantifiers when appropriate (e.g., $\forall$ often appears in properties like commutativity, associativity, etc.)
     • restate the problem more precisely if it helps
  2. some other proofs from the exercises of §1.2 — practice proof-writing; e.g., proof that $d(\vec{u},\vec{v})=0$ iff $\vec{u}=\vec{v}$
  3. the phrase "if and only if," and its synonyms "iff" and "$\Leftrightarrow$"
  4. linear equations and systems of linear equations
  5. solutions of linear systems
  6. examples of linear systems with no solutions, a unique solution, and an infinite number of solutions
• Maxiquiz 1 today
• Hand in I31 and HW1: 1.2.{34, 60, 62}
• Today [Friday] is the last day to add classes.

#### Week 2

• :
• [Re]Read: §2.1 and Read: §2.3
• Content:
  1. [in]consistent linear systems
  2. the solution set of a linear system and its description as intersecting planes
  3. the coefficient and augmented matrices of a linear system
  4. the relationship between a linear system being homogeneous and being consistent
  5. linear combinations [again]
  6. Span, with a few basic examples and properties (e.g., the span of a single vector is the line along that vector)
  7. the result that a linear system is consistent iff the vector consisting of the right-hand side constant values is in the span of the columns
• Miniquiz 3 today
• :
• [Re]Read: §2.3
• Content:
  1. [non]triviality, both for linear combinations and for solutions of a linear system
  2. some elementary facts about $\Span$, such as that each of the vectors $\vec{0},\vec{v}_1,\dots,\vec{v}_k$ is in the set $\Span(\vec{v}_1,\dots,\vec{v}_k)$
  3. linearly [in]dependent vectors — note it is important that the scalars in the definition are not all zero
  4. the set $\{\vec{0}\}$ is linearly dependent — in fact, any collection of vectors containing the zero vector is linearly dependent
• Miniquiz 4 today
• :
• [Re]Read: §2.3
• Content:
  1. if $\{\vec{v}_1,\dots,\vec{v}_k\}$ is a linearly dependent set, then one of the $\vec{v}_j$ is a linear combination of the remaining vectors
  2. more on proofs by contradiction; the structure:
     • You want to prove $P\Rightarrow Q$.
     • You assume $P$.
     • You assume $\sim Q$ (read that "not $Q$").
     • You use those assumptions, logic, calculations, unpacking definitions, prior theorems, etc., to derive a contradiction — some statement $S$ and $\sim S$ being simultaneously true. We write "$\Rightarrow\Leftarrow$" when the contradiction is found.
     • At this point, the last assumption you made must be false. Since that assumption was $\sim Q$, it must be instead that $Q$ is true.
• Miniquiz 5 today
• Hand in I32 and HW2: 1.2.70, 2.2.44, 2.3.{20, 44}
• :
• Read: §§2.2 & 2.3
• Content:
  1. some mathematical logic: Start with a statement $S$ of the form $P\Rightarrow Q$. Then we define:
     • the converse of $S$ is the statement $Q\Rightarrow P$
     • the contrapositive of $S$ is the statement $\sim Q\Rightarrow\sim P$

     Here is the logical validity of these statements:
     • The truth of the converse is completely independent of the truth of the original statement: either one could be true or false without affecting the other.
     • The contrapositive of a statement is logically equivalent to the statement. I.e., $S$ is true if and only if its contrapositive is true. Note: this is commonly used logic even in real life. E.g.: it is true that if the Sun goes supernova today, then the Earth will be vaporized tomorrow. Equivalently, if the Earth, tomorrow, is not vaporized, then the Sun must not have gone supernova today! As a consequence, we sometimes choose to prove the contrapositive of a result we want, because this might be easier to approach for some reason.
  2. proof that if $\{\vec{v}_1,\dots,\vec{v}_k\}$ is a linearly independent set, then any subset is also linearly independent — by proving the contrapositive!
  3. defined terms:
     • elementary row operation (applied to a matrix), shortened to ERO
     • a matrix $M$ may be in row-echelon form, shortened to "$M$ is REF"
     • a matrix $M$ may be in reduced row-echelon form, shortened to "$M$ is RREF"
• Maxiquiz 2 today ... which became a take-home quiz. Please take this seriously: no consultation with others (including your textbook and the Internet), and sit down and work on it at one go — do not work for a while, get up, do other things, come back and do more, etc.

#### Week 3

• :
• Yes, we do have class today, even though it is the federal holiday celebrating the achievements of workers and the labor movement — if you like the 40 hour work week, now is the time to give thanks.
• Maxiquiz 2 is due at the beginning of class.
• Today [Monday] is the last day to drop classes without a grade being recorded.
• Read: §2.2
• Content:
  1. row reduction
  2. free variables
  3. rank
  4. the Rank Theorem
  5. the relationship of linear [in]dependence and the linear system whose coefficient matrix has columns which are the vectors under consideration: there will be a non-trivial solution of the homogeneous linear system with that coefficient matrix if and only if the vectors are linearly dependent
  6. a term is well-defined if any choices in its definition are mentioned explicitly
  7. rank is well-defined because matrices always do have an RREF form, and that form is unique; they also always do have an REF form, but that is not unique
  8. necessary linear dependence of $k$ vectors in $\RR^n$ if $k>n$
  9. context and type for definitions ... see the handout on definitions for more on this theme
• Miniquiz 6 today
• :
• Read: §§3.1 & 3.2
• Content:
  1. definitions for matrices
  2. equality for matrices
  3. addition/subtraction for matrices
  4. scalar multiplication for matrices
  5. matrix multiplication — not commutative!
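The non-commutativity of matrix multiplication is easy to check numerically. Here is a minimal sketch in plain Python, with two arbitrary $2\times2$ matrices chosen for illustration:

```python
# A quick numerical check that matrix multiplication is not commutative.
# The matrices A and B are arbitrary small examples.

def mat_mult(A, B):
    """Multiply two matrices given as lists of rows."""
    m, p = len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

AB = mat_mult(A, B)
BA = mat_mult(B, A)

print(AB)  # [[2, 1], [4, 3]]
print(BA)  # [[3, 4], [1, 2]]
assert AB != BA  # the two products differ, so multiplication is not commutative
```

Note that $B$ here is an elementary matrix (it swaps rows when multiplying on the left, and columns on the right), which is why $AB$ and $BA$ are $A$ with columns, respectively rows, swapped.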
  6. the identity matrix
  7. elementary matrices
  8. transposes of matrices; transposes and multiplication
  9. [skew-]symmetric matrices
  10. properties of matrix operations
• Hand in I33 and HW3: 2.3.48, Chapter Review Exercises p141: 14, 16, 18
• Miniquiz 7 today
• :
• Read: §3.3
• Content:
  1. the inverse of a matrix
  2. invertible matrices vs non-invertible, or singular, matrices
  3. uniqueness of inverses
  4. inverses of elementary matrices
  5. inverses of $2\times2$ matrices
  6. inverses and transposition
  7. inverses and solving linear systems
  8. the book's Fundamental Theorem of Invertible Matrices
• Miniquiz 8 today
• :
• Read: §§3.3 & 3.5
• Content:
  1. the inverse of a product of two matrices
  2. how about more matrices? We'll need:
  3. Proof by induction, or proofs using the Principle of Mathematical Induction, which goes like this:
     • It only applies to theorems of the specific form "$\forall n\in\NN\ S(n)$ is true," where $S(n)$ is a mathematical statement which depends upon a natural number parameter $n$.
     • First one proves that $S(1)$ is true; this is called the base case.
     • Then one proves "If $S(n)$, then $S(n+1)$"; this is called the inductive step and, during the proof of this step, when one invokes the hypothesis $S(n)$, one calls it the inductive hypothesis.
     • One declares the theorem proven by induction (and goes home happy).
  4. an example of an inductive proof: showing that $\sum_{j=1}^n j = \frac{n(n+1)}{2}$
  5. another example: proving that $\forall n\in\NN$, if $A_1,\dots,A_n$ are invertible matrices, then $\left(A_1\cdot\dots\cdot A_n\right)^{-1}=A_n^{-1}\cdot\dots\cdot A_1^{-1}$
  6. definition of a subspace of $\RR^n$: it is a subset $S\subseteq\RR^n$ satisfying the properties:
     1. $\vec{0}\in S$
     2. $\forall \vec{u},\vec{v}\in S\ \vec{u}+\vec{v}\in S$
     3. $\forall \vec{u}\in S,\forall\alpha\in\RR\ \alpha\vec{u}\in S$
  7. examples of subspaces of $\RR^2$ and $\RR^3$:
     1. the trivial subspace $\left\{\vec{0}\right\}$
     2. any line through the origin
     3. any plane through the origin
     4. the whole thing ($\RR^2$ or $\RR^3$ as a subspace of itself)
• Maxiquiz 3 today

#### Week 4

• :
• Hand in I34 and HW4: 3.1.38, 3.2.{26, 36}, 3.3.{42, 44}
• [Re]Read: §3.5
• Content:
  1. Another inductive proof: $\forall n\in\NN$, one can construct a segment of length $\sqrt{n}$ with a ruler and compass.
  2. here's a nice inductive proof: All pigs are yellow. [See if you can find the flaw in that proof.]
  3. going over some recent HW and quizzes
  4. the rest of today is subspace day:
  5. thinking more about subspaces of $\RR^n$ ... Notice that the first part of the book's definition is actually not necessary as long as the subset $S$ is non-empty: if it has any vector $\vec{u}$ at all, then it has $0\cdot\vec{u}=\vec{0}$ as well. And the second and third parts of the book's definition together say that subspaces are closed under linear combinations. So here is another, equivalent definition of what it means for a subset $S\subseteq\RR^n$ to be a subspace:
     1. $S\neq\emptyset$ (remember the notation for the empty set)
     2. $\forall \vec{u},\vec{v}\in S,\forall\alpha,\beta\in\RR\ \ \alpha\vec{u}+\beta\vec{v}\in S$
  6. filling out our list of examples of subspaces:
     1. the trivial subspace $\left\{\vec{0}\right\}$
     2. any line through the origin
     3. any plane through the origin
     4. a hyperplane through the origin in higher dimensions (e.g., the set of vectors $\begin{pmatrix}x_1\\x_2\\x_3\\x_4\end{pmatrix}\in\RR^4$ with components satisfying $a\,x_1+b\,x_2+c\,x_3+d\,x_4=0$, where $a,b,c,d\in\RR$, is a "three-dimensional hyperplane" in $\RR^4$; think of it as a linear system with only one equation: it has three free variables, so three parameters are needed to specify a point on this hyperplane)
     5. $\RR^n$ is itself a subspace of $\RR^n$. (Note that any subspace other than this one, so any subspace of $\RR^n$ which is not all of $\RR^n$, is called a proper subspace.)
     6. Spans are subspaces: Theorem: $\forall n,k\in\NN$ and $\forall\vec{v}_1,\dots,\vec{v}_k\in\RR^n$, $\Span(\vec{v}_1,\dots,\vec{v}_k)$ is a subspace of $\RR^n$
  7. defining the row space, column space, and null space of a matrix $A$, written $\operatorname{row}(A)$, $\operatorname{col}(A)$, and $\operatorname{null}(A)$
  8. for an $m\times n$ matrix $A$, $\operatorname{row}(A)$ and $\operatorname{null}(A)$ are vector subspaces of $\RR^n$, while $\operatorname{col}(A)$ is a vector subspace of $\RR^m$
• Miniquiz 9 today
• :
• [Re]Read: §3.5
• Content:
  1. All pigs are not yellow, alas.
  2. going over some recent HW and quizzes
  3. the rest of today is basis day:
  4. the span of a bunch of vectors is a subspace, but it need not be an efficient way to describe that subspace. Looking for a way to characterize an efficient set of vectors to build a subspace, we defined a basis of a subspace of $\RR^n$
  5. examples of bases:
     1. the trivial subspace does not have a basis
     2. the standard basis of $\RR^n$ (which we have met before; it is the $n$ vectors $$\vec{e}_1=\begin{pmatrix}1\\0\\0\\\vdots\\0\\0\end{pmatrix},\vec{e}_2=\begin{pmatrix}0\\1\\0\\\vdots\\0\\0\end{pmatrix},\dots,\vec{e}_n=\begin{pmatrix}0\\0\\0\\\vdots\\0\\1\end{pmatrix},$$ where $\vec{e}_j$ is the vector in $\RR^n$ which has a $1$ in the $j^\text{th}$ component and $0$'s everywhere else; the dimension $n$ is not part of the notation $\vec{e}_j$, it must be understood from context) is, as the name suggests, a basis.
  6. but note that subspaces always have many different bases ... (examples)
• Miniquiz 10 today
• :
• [Re]Read: §3.5
• Content:
  1. today is dimension day:
  2. we've seen that bases are very much not uniquely determined by their subspaces ... but the number of vectors in a basis does seem to be uniquely determined by the subspace (examples); hence we define the dimension of a subspace of $\RR^n$
  3. dimension is well-defined, says The Basis Theorem: Given a subspace $S$ of $\RR^n$, any two bases of $S$ have the same number of vectors.
  4. the proof of The Basis Theorem ... is pretty
  5. from the row space comes the rank
  6. the rank of a matrix is also the dimension of its column space
  7. from the null space comes the nullity
  8. The Rank-Nullity Theorem
• Miniquiz 11 today
• :
• Hand in I35 and HW5: 3.5.{4, 40, 58, 62} [the middle two of these are not long or hard proofs, if you use the powerful results in §3.5 of the book; the last one requires some thought and working out details — but it is a very nice proof when you're done!]
• [Re]Read: §3.5
• Content:
  1. notice that the row space of a matrix doesn't change as we do EROs to the matrix — the column space does change
  2. a basis of the row space of a matrix will consist of the non-zero rows of its RREF form
  3. a little on the proof of The Rank[-Nullity] Theorem
  4. an example of finding the null space of a matrix
  5. please read The Fundamental Theorem of Invertible Matrices over the weekend
     • notation: TFAE = "the following are equivalent"; i.e., the following statements are all joined by "iff"
• Maxiquiz 4 today

#### Week 5

• :
• Read: §3.6
• Content:
  1. going over some recent HW and quizzes
  2. more on finding bases of the row, column, and null spaces
  3. repeating the Rank-Nullity Theorem and mentioning its proof
  4. discussion of the Fundamental Theorem of Invertible Matrices [which, remember, you were supposed to read carefully over the weekend]
  5. defining a linear transformation
• Miniquiz 12 today
• :
• [Re]Read: §3.6
• Content:
  1. examples of linear transformations
  2. left multiplication by an $m\times n$ matrix as a linear transformation from $\RR^n$ to $\RR^m$
• Miniquiz 13 today
• :
• [Re]Read: §3.6
• Content:
  1. finding the matrix of a linear transformation $\RR^n\to\RR^m$
  2. composition of linear transformations
• Hand in I36 and HW6: 3.5.64 [hint: prove that every element of $\operatorname{col}(A+B)$ is the sum of an element of $\operatorname{col}(A)$ and an element of $\operatorname{col}(B)$; then explain why this suffices for the problem] and 3.6.{4, 8, 44}
• Miniquiz 14 today
• :
• [Re]Read: §3.6
• Content:
  1. domain, codomain, range
  2. nullity and injectivity ("1-1ness") of a linear transformation
• Maxiquiz 5 today

#### Week 6

• :
• [Re]Read: §3.6
• Content:
  1. notation for the matrix of a linear transformation $T$ will be $[T]$
  2. composition of linear transformations and the consequence for their matrices: if $S:\RR^n\to\RR^m$ and $T:\RR^m\to\RR^p$ are linear, then the matrix $[T\circ S]$ of the composition $T\circ S:\RR^n\to\RR^p$ satisfies $[T\circ S]=[T]*[S]$ (where "$*$" means matrix multiplication)
  3. remember: a linear transformation $T:\RR^n\to\RR^m$ is 1-1 if and only if $\operatorname{nullity}([T])=0$
  4. a linear transformation $T:\RR^n\to\RR^m$ cannot be 1-1 if $n>m$
  5. if the linear transformation $T:\RR^n\to\RR^m$ is invertible, then $n$ must equal $m$
• no miniquiz today, alas...
• :
• Read: §4.2
• Content:
  1. the inverse of a linear transformation, if it exists, is a linear transformation
  2. definition of the determinant:
     • for $1\times 1$ matrices
     • for $2\times 2$ matrices
     • recursively for $n\times n$ matrices, where $n\ge 2$ — which is essentially Laplace's Expansion Theorem
• Hand in I37 and HW7: Chapter 3 Review Questions, p263: 14, 16, 18
• Miniquiz 15 today
• :
• [Re]Read: §4.2
• Content:
  1. the determinant, using the definition we gave, is well-defined
  2. if the square matrix $A$ has a row or column of zeros, then $\det(A)=0$
  3. if the square matrix $A$ is upper- or lower-triangular, then $\det(A)$ is the product of the diagonal elements of $A$
  4. if we get the matrix $B$ by multiplying one of the rows of the square matrix $A$ by the constant $k$, then $\det(B)=k\det(A)$
  5. if $A$ is $n\times n$, then $\det(kA)=k^n\det(A)$
  6. determinants of elementary matrices, and of matrices before and after EROs are done to them
• Miniquiz 16 today
• :
• [Re]Read: §4.2
• Content:
  1. if $A$ and $B$ are $n\times n$ matrices, then $\det(AB)=\det(A)\det(B)$ — which is amazing, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other!
  2. the determinant determines if a matrix is invertible: for an $n\times n$ matrix $A$, $\det(A)\neq0\ \Leftrightarrow\ A$ is invertible
• Maxiquiz 6 today

#### Week 7

• :
• Content:
  1. going over Maxiquiz 6
  2. some last details on theorems from last Friday
  3. for an $n\times n$ matrix $A$, $\det(A^T)=\det(A)$
  4. Review for Test I. See this review sheet
• Hand in I38 and HW8: 4.2.{46, 54, 56, 69}
• Miniquiz 17 today
• :
• Test I in class today.
• :
• Test I post-mortem.
• no miniquiz today, alas...
• :
• Read: §§4.1 & 4.3
• Content:
  1. defining eigenvectors, eigenvalues, and eigenspaces
  2. examples of $2\times 2$ matrices with 0, 1, and 2 distinct eigenvalues
  3. Theorem: If $\lambda$ is an eigenvalue of the matrix $A$, then $\det(A-\lambda I_{n\times n})=0$.
  4. an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$
  5. the characteristic polynomial/equation of an $n\times n$ matrix
  6. the algebraic multiplicity of an eigenvalue
  7. the geometric multiplicity of an eigenvalue
  8. examples of eigenspaces and multiplicities of eigenvalues
• no maxiquiz today, alas again...
• Hand in Test I revisions, if you like.

#### Week 8

• :
• [Re]Read: §4.3
• Content:
  1. eigenvalues of triangular matrices
  2. eigenvalues of invertible matrices
  3. eigenvectors corresponding to distinct eigenvalues are linearly independent
  4. similarity of matrices, written $A\sim B$
• Miniquiz 18 today
• :
• Read: §4.4
• Content:
  1. ${}\sim{}$ is an equivalence relation
  2. properties shared by matrices $A$ and $B$ if they are similar:
     • determinant
     • invertibility
     • rank
     • characteristic polynomial
     • eigenvalues
  3. diagonalizable matrices
  4. $A\in M_{n\times n}(\RR)$ is diagonalizable iff $\RR^n$ has a basis consisting of eigenvectors of $A$
• Hand in I39 and HW9: 4.1.{24, 26, 37}, 4.3.{20, 24}
• Miniquiz 19 today
• :
• [Re]Read: §4.4
• Content:
  1. building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
     • a $2\times 2$ example
  2. The Diagonalization Theorem
  3. things to notice about an invertible matrix, say called $P$:
     • the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
     • conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is an invertible $n\times n$ matrix
     • multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j$, again for $1\le j\le n$
     • multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j$, for $1\le j\le n$
  4. what this has to do with diagonalization:
     • if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then, building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? Because $$\left(P^{-1}AP\right)\vec{e}_j=P^{-1}\left(A\left(P\vec{e}_j\right)\right)=P^{-1}\left(A\vec{p}_j\right)=P^{-1}\left(\lambda_j\vec{p}_j\right)=\lambda_j\left(P^{-1}\vec{p}_j\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]
  5. so we get the Diagonalization Theorem, two versions:
     • if separate bases of all the eigenspaces of an $n\times n$ matrix $A$, when put together, yield a basis of all of $\RR^n$, then $A$ is diagonalizable
     • if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
  6. a corollary of the Diagonalization Theorem is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
  7. examples of the practical process of diagonalizing a matrix we are given, following the above procedure
  8. hints on the Jordan Canonical Form, with examples of $2\times2$ matrices
• Miniquiz 20 today
• :
• Read: §5.1
• Content:
  1. review of the Diagonalization Theorem and some examples
  2. orthogonal and linearly independent sets of vectors in $\RR^n$
  3. an orthonormal basis [ONB]
• Maxiquiz 7 today

#### Week 9

• :
• [Re]Read: §5.1
• Content:
  1. going over Maxiquiz 7
  2. coordinates with respect to an ONB — computing the coefficients by dot products
  3. some convenient notation: the Kronecker delta $\delta_{ij}$
  4. definition of orthogonal matrices; the notation $O(n)$ for the set of such
  5. finally, we define the notation $GL_n(\RR)$ for the set of invertible $n\times n$ matrices
  6. an alternative characterization of orthogonal matrices
• Miniquiz 21 today
• Hand in I310 and HW10: 4.3.{34, 36}, 4.4.{24, 40, 47}
• :
• [Re]Read: §5.1
• Content:
  1. the effect of multiplying by an orthogonal matrix on the dot product or the norm of vectors
  2. $O(n)$ is closed under products and inverses
  3. determinants of orthogonal matrices
  4. $GL_n(\RR)$ and $O(n)$ are groups, meaning they are sets on which there is defined an associative multiplication (so the multiplication is closed: products of elements in the set stay in the set), there is a multiplicative identity, and every element has a multiplicative inverse
  5. useful notation: "$\exists!x$ ..." means "there exists a unique $x$ ..."
• Miniquiz 22 today
• :
• Read: §5.2
• Content:
  1. defining the orthogonal complement $W^\perp$ (pronounced "W-perp") of a subspace $W\subseteq\RR^n$
  2. proved that for any subspace $W$ of $\RR^n$, $W^\perp$ is also a subspace of $\RR^n$
  3. for a matrix $A$: perps of column- and row-spaces, nullspaces and nullspaces of $A^T$ (stated without proofs)
  4. orthogonal projections
  5. the Orthogonal Decomposition Theorem
• Miniquiz 23 today
• :
• Read: §5.3 pp. 399-403 only
• Content:
  1. the Gram-Schmidt Process
• Maxiquiz 8 handed out today; it is due Monday
• Hand in I311 and HW11: 5.1.{28, 37 [hint: make an appropriate orthogonal matrix and use Theorem 5.6c]}, 5.2.{24, 25}

#### Week 10

• :
• Read: §5.4
• Content:
  1. defining orthogonally diagonalizable
  2. starting the Spectral Theorem: orthogonally diagonalizable implies symmetric
  3. example of orthogonally diagonalizing a symmetric matrix: the key seems to be to find an ONB of $\RR^n$ consisting of eigenvectors
  4. mentioned (proof next time) that symmetric implies eigenspaces are orthogonal — this is the key step in the Spectral Theorem
• hand in Maxiquiz 8
• Miniquiz 24 today
• :
• [Re]Read: §5.4
• Content:
  1. even more on the Spectral Theorem
  2. consequences of symmetry ... distinct eigenspaces are orthogonal
  3. example application of the Spectral Theorem: the sum of two orthogonally diagonalizable matrices is also orthogonally diagonalizable
• Miniquiz 25 today
• :
• Read: §5.5 — pp. 425-432 (the part called "Quadratic Forms") only
• Content:
  1. yet more on the Spectral Theorem
  2. defining quadratic form
  3. examples of quadratic forms: upward- and downward-pointing paraboloids and saddles
  4. diagonalization of quadratic forms
  5. brick-throwing demonstration: this is related to the inertia tensor, and applying the Spectral Theorem yields something called The Principal Axes Theorem in physics
• Miniquiz 26 today
• Hand in I312 and HW12: 5.3.{6, 8}, 5.4.{12, 14, 16}
• :
• Read: §6.1
• Content:
  1. defining [abstract] vector space
  2. starting examples (and non-examples) of vector spaces:
     • the trivial vector space $\{\vec{0}\}$
     • $\RR^n$ with the usual vector addition and scalar multiplication
     • $\RR^2$ with modified vector addition(s) is often not a vector space
     • spaces of functions, such as:
       • $\Ff(\RR)$ — the space of all functions on the real line $\RR$
       • $C(\RR)$ — the space of continuous functions on the real line $\RR$
       • $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
       • $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
       • $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
       • $\Pp$ — the space of all polynomials in one variable

       all with the pointwise addition of functions and scalar multiplication on functions
     • $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices
• Maxiquiz 9 today

#### Week 11

• :
• [Re]Read: §6.1
• Content:
  1. more discussion of the vector spaces of functions we defined last class — these form a chain of subspaces: $\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
  2. some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition, e.g.:
     • in any vector space $V$, $0\vec{u}=\vec{0}\ \ \forall\vec{u}\in V$
     • in any vector space $V$, $(-1)\vec{u}=-\vec{u}\ \ \forall\vec{u}\in V$
     • in any vector space $V$, $\alpha\vec{0}=\vec{0}\ \forall \alpha\in\RR$
  3. defining [vector] subspace
  4. examples of subspaces
  5. how to check if something is a subspace (it's closed under vector addition and scalar multiplication)
• Miniquiz 27 today
• :
• [Re]Read: §6.1 and Read: §6.2
• Content:
  1. more about vector subspaces, e.g., the $\vec{0}$ in a subspace is the same vector as the $\vec{0}$ in the ambient space
  2. another example: the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
  3. defining Span in an abstract vector space
  4. Span as an intersection
  5. defining linearly [in]dependent in an abstract vector space, and examples
• Hand in I313 and HW13: 5.5.{38, 54}, 6.1.{2, 6, 48}
• Miniquiz 28 today
• :
• [Re]Read: §6.2
• Content:
  1. defining basis in an abstract vector space, and examples
  2. defining dimension in an abstract vector space, and examples
  3. defining [in]finite dimensional for an abstract vector space, and examples
  4. the Basis Theorem in an abstract vector space
  5. more on linear independence, particularly in infinite dimensional vector spaces
• Miniquiz 29 today
• :
• [Re]Read: §6.2 and Read: §6.3
• Content:
  1. coordinates with respect to a basis in an abstract vector space
  2. the change of basis matrix $P_{\Cc\leftarrow\Bb}$
• Maxiquiz 10 today

#### Week 12

• :
• Content:
  1. going over Maxiquiz 10
  2. Review for Test II. See this review sheet
• Hand in I314 and HW14: 6.2.{34, 44}, 6.3.{12, 16}
• :
• Test II in class today.
• :
• Test II post-mortem.
• no miniquiz today, alas...
• :
• Read: §6.4
• Content:
  1. a linear transformation between abstract vector spaces
  2. examples of linear transformations:
     • the zero transformation
     • the identity transformation
     • matrix multiplication
     • differentiation in $\Pp$
  3. properties of linear transformations:
     • they map $\vec{0}_V$ to $\vec{0}_W$
     • they behave nicely with respect to the "additive inverse" operation $\vec{v}\mapsto-\vec{v}$
     • compositions of LTs are LTs
• no maxiquiz today, alas again...
• Hand in Test II revisions, if you like.

#### Week 13

• :
• [Re]Read: §6.4 and Read: §6.5
• Content:
  1. linear transformations are determined by what they do to a basis, which is, however, completely free: that is, if we want to make a linear transformation $T:V\to W$, and if $\{\vec{v}_1,\dots,\vec{v}_n\}$ is a basis of $V$, we can choose any vectors $\vec{w}_1,\dots,\vec{w}_n$ we like in $W$, and there will be a unique linear $T$ which satisfies $$T(\vec{v}_1)=\vec{w}_1,\quad\dots,\quad T(\vec{v}_n)=\vec{w}_n\ \ .$$
  2. more examples of linear transformations
  3. the kernel of a linear transformation
  4. the range of a linear transformation
  5. one-to-one (or 1-1 or injective)
  6. onto (or surjective)
  7. inverses of linear transformations
• Miniquiz 30 today
• :
• [Re]Read: §6.5
• Content:
  1. kernels and ranges are always vector subspaces
  2. rank and nullity (again)
  3. The Rank-Nullity Theorem (again)
  4. isomorphisms and isomorphic (written "$\cong$")
  5. a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic
• Miniquiz 31 today
• :
• Read: §6.6
• Content:
  1. defining the matrix of a linear transformation with respect to bases of its domain and codomain
  2. the matrix of a composition of linear transformations
  3. the matrix of the inverse of a linear transformation
  4. matrices of endomorphisms of vector spaces and similarity using the change of basis matrix
• Miniquiz 32 today
• :
• Read: §7.1
• Content:
  1. an inner product and inner product space
  2. examples of inner product spaces:
     • $\RR^n$ with the usual dot product
     • the $L^2$ inner product on $C([0,1])$
     • $L^2$ also works on $\Pp$ and $\Pp_n$ ($n\in\NN$)
     • another inner product on $\Pp_n$: $\langle p,q\rangle=p(0)q(0)+\dots+p(n)q(n)$ (just using the first term $p(0)q(0)$ works in all parts of the definition of an inner product except for non-degeneracy; with all $n+1$ terms it becomes a full inner product ... which we know because of the Fundamental Theorem of Algebra!)
• Hand in I315 and HW15: 6.4.{22, 24}, 6.5.{27, 34}, 6.6.16
• Maxiquiz 11 today

#### Week 14

• Thanksgiving Break!
  No classes, of course.

#### Week 15

• :
• [Re]Read: §7.1
• Content:
  1. elementary properties of inner products
  2. length or norm of vectors in an inner product space
  3. the Pythagorean Theorem
  4. orthogonal vectors in an inner product space
  5. projections and the Gram-Schmidt Process in an inner product space
  6. an orthonormal set in the inner product space $C([-\pi,\pi])$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios
• Miniquiz 33 today
• :
• Read: §7.3 pp. 591-604 only
• Content:
  1. the best approximation of a vector $\vec{v}$ in a subspace $W$
  2. The Best Approximation Theorem
  3. the least squares approximation
  4. a least squares solution of a linear system
• Miniquiz 34 today
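The best-approximation idea above can be tried numerically. Here is a minimal sketch in plain Python, with made-up vectors in $\RR^3$: the projection of $\vec{v}$ onto the line spanned by $\vec{u}$ is the closest point to $\vec{v}$ in that subspace, and the residual is orthogonal to the subspace:

```python
# Sketch of the best approximation idea: the orthogonal projection of v onto
# span(u) is the closest vector to v in that subspace.  The vectors u and v
# are made up for illustration.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def proj(u, v):
    """Projection of v onto the line spanned by u: (u.v / u.u) u."""
    c = dot(u, v) / dot(u, u)
    return [c * a for a in u]

u = [1.0, 2.0, 2.0]
v = [3.0, 0.0, 3.0]

p = proj(u, v)                      # the best approximation of v in span(u)
r = [a - b for a, b in zip(v, p)]   # the residual v - p

# The residual is orthogonal to the subspace, as the
# Orthogonal Decomposition Theorem predicts.
assert abs(dot(u, r)) < 1e-12
```

For a higher-dimensional subspace $W$, one projects onto each vector of an orthogonal basis of $W$ and adds the results; that is exactly where the Gram-Schmidt Process earns its keep.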
• :
• Content:
1. the normal equations corresponding to a given linear system
2. The Least Squares Theorem
3. examples/application of least squares solutions
• Miniquiz 35 today
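The normal equations $A^TA\hat{x}=A^T\vec{b}$ can be worked through on a tiny made-up example. Here is a sketch in plain Python, fitting a least squares line $y=c_0+c_1t$ through three data points that no single line passes through:

```python
# Minimal sketch of the normal equations A^T A x = A^T b for a least squares
# line fit y = c0 + c1*t.  The data points (0,1), (1,2), (2,4) are made up;
# no line passes through all three, so we settle for the least squares line.

def transpose(A):
    return [list(col) for col in zip(*A)]

def mat_mult(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1.0, 0.0],   # each row is [1, t] for one data point
     [1.0, 1.0],
     [1.0, 2.0]]
b = [[1.0], [2.0], [4.0]]

At = transpose(A)
AtA = mat_mult(At, A)   # the 2x2 matrix A^T A
Atb = mat_mult(At, b)   # the 2x1 vector A^T b

# Solve the 2x2 system AtA x = Atb by the explicit inverse formula.
[[p, q], [r, s]] = AtA
det = p * s - q * r
c0 = (s * Atb[0][0] - q * Atb[1][0]) / det
c1 = (-r * Atb[0][0] + p * Atb[1][0]) / det

print(c0, c1)   # c0 = 5/6, c1 = 3/2, so the least squares line is y = 5/6 + (3/2)t
```

Note that $A^TA$ is invertible here because the columns of $A$ are linearly independent, which is the hypothesis of the Least Squares Theorem guaranteeing a unique least squares solution.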
• :
• Content:
1. going over recent HWs
2. review for the Final Exam next week; see this review sheet
• Hand in I316 and HW16: 7.1.{16, 34, 40}, 7.3.{8, 16, 26}
• Maxiquiz 12 today
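Since the Gram-Schmidt Process in inner product spaces is on the final exam review, here is a minimal numerical sketch in plain Python, using the ordinary dot product on $\RR^3$ and made-up input vectors:

```python
# Minimal sketch of the Gram-Schmidt process: subtract from each vector its
# projections onto the previously built orthonormal vectors, then normalize.
# The input vectors are made up for illustration.

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same subspace as `vectors`."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            c = dot(q, w)                                 # coefficient <q, w>
            w = [wi - c * qi for wi, qi in zip(w, q)]     # remove the q-component
        norm = dot(w, w) ** 0.5
        basis.append([wi / norm for wi in w])             # normalize
    return basis

q1, q2 = gram_schmidt([[3.0, 0.0, 4.0], [1.0, 1.0, 1.0]])

assert abs(dot(q1, q2)) < 1e-12      # orthogonal
assert abs(dot(q1, q1) - 1) < 1e-12  # unit length
assert abs(dot(q2, q2) - 1) < 1e-12
```

In an abstract inner product space the same loop works verbatim once `dot` is replaced by the inner product at hand, e.g. the $L^2$ inner product on $C([-\pi,\pi])$.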

#### Week 16

• Exam week, no classes.
• :
• Today is the last day to hand in all late work and re-dos for class credit. You can also pick up the graded HW16. There will be extra office hours all day today, as well, but please make an appointment for a specific time if you know when it will be (if you don't, just drop by anyway).
• :
• FINAL EXAM PART ONE from 8-10:20am in our usual classroom
• :
• FINAL EXAM PART TWO from 10:30am-12:50pm in our usual classroom