## Colorado State University, Pueblo; Spring 2012 Math 307 — Introduction to Linear Algebra Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course textbook, Linear Algebra, A Modern Introduction (2nd edition), by David Poole.

Also in the following, the camera icon means that class was videoed that day and can be seen through the Blackboard page for this class. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day.

This schedule will change very frequently, so please check it at least every class day, and before starting work on any assignment (in case the content of the assignment has changed).

• M:
• Content:
1. bureaucracy and introductions
2. what is Linear Algebra (the study of vector spaces and linear transformations...)
3. why do we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for
• Miniquiz 0
• T:
• Read: To the Student, p. xxiii and §§ 1.1–1.3
• Miniquiz 1
• Journals are not due today; it's a bit too early in the term. (First Journal entry will be due next Tuesday.)
• Content:
1. some basic terminology and notation:
• logical and basic set theoretic terminology/notation
• some basic sets of numbers
1. natural numbers $\NN$
2. integers $\ZZ$
3. rationals $\QQ$
4. real numbers $\RR$
• starting good definitional style, including
• all variables must be "bound"
• clearly identify the symbol and/or terminology being defined
• clearly identify the type of object being defined
• vectors in $\RR^n$
• scalar multiplication
• the dot product
• norms
2. some basic properties
• of vector arithmetic
• of dot products and norms
• the triangle inequality
• Do HW0: Send me e-mail (to jonathan.poritz@gmail.com) telling me:
1. Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
2. The reason you are taking this course.
3. What you intend to do after CSUP, in so far as you have an idea.
4. Past math classes you've had.
5. Other math and science classes you are taking this term, and others you intend to take in coming terms.
6. Anything else you think I should know (disabilities, employment or other things that take a lot of time, etc.)
7. [Optional:] If you were going to be trapped on a desert island alone for ten years, what music would you like to have?
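The dot product, norms, and the triangle inequality from today can be checked concretely; here is a short Python sketch (the example vectors are chosen for illustration only):

```python
import math

def dot(u, v):
    """Dot product of two vectors given as lists of numbers."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    """Euclidean norm: ||u|| = sqrt(u . u)."""
    return math.sqrt(dot(u, u))

u = [1.0, 2.0, 2.0]
v = [3.0, 0.0, 4.0]
s = [ui + vi for ui, vi in zip(u, v)]   # the vector u + v

# The triangle inequality: ||u + v|| <= ||u|| + ||v||
assert norm(s) <= norm(u) + norm(v)
print(dot(u, v), norm(u), norm(v))       # 11.0  3.0  5.0
```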
• W:
• Read: §§ 2.1 & 2.2
• Miniquiz 2
• Content:
1. more basic terminology and notation...
• complex numbers
• angles between vectors
• quantifiers $\forall$ and $\exists$
• more on what makes a good definition
• [systems of] linear equations
• solutions of linear systems, the solution set and its structure
• a[n in]consistent linear system
• the coefficient and augmented matrices of a linear system
• elementary row operations (EROs)
• row-equivalent matrices
• [in]homogeneous linear systems
2. more basic properties
• solution sets of linear systems are either empty, have exactly one point, or have an infinite number of points
• F:
• Content:
1. yet more basic terminology and notation...
• angles between vectors
• orthogonal vectors
• a [non]trivial solution of a linear system
• relation between [non]trivial solutions, [non]homogeneous linear systems, and [non]unique solutions
• more on what makes a good definition
• what makes a good statement of a result (theorem, proposition, etc.)
• [starting on] what makes a good proof
• proof structures:
1. contradiction ("it can't not be true")
2. many more to come...
• the [reduced] row-echelon form of a matrix
• row-reduction
• free variables
• the rank of a matrix
2. more basic properties
• The Rank-Nullity Theorem
• Maxiquiz 1 handed out today, due on Monday
• Today [Friday] is the last day to add classes.
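Row-reduction and the Rank[-Nullity] Theorem can be illustrated in a small Python sketch (a toy Gaussian elimination, with an example matrix chosen for illustration; the rank is the number of pivots, and nullity = number of columns minus rank):

```python
def rank(rows, tol=1e-12):
    """Rank of a matrix (a list of row lists) via Gaussian elimination."""
    m = [row[:] for row in rows]   # work on a copy
    nrows, ncols = len(m), len(m[0])
    r = 0                          # next pivot row
    for c in range(ncols):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, nrows) if abs(m[i][c]) > tol), None)
        if piv is None:
            continue               # no pivot: column c gives a free variable
        m[r], m[piv] = m[piv], m[r]
        # clear column c from every other row
        for i in range(nrows):
            if i != r and abs(m[i][c]) > tol:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [2, 4, 6],     # = 2 * row 1, so it contributes no new pivot
     [1, 0, 1]]
print(rank(A))      # 2, so the nullity is 3 - 2 = 1 free variable
```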

• M:
• Content:
1. going over Maxiquiz 2 — the moral was write down the definitions, it's often enough!
2. recall from Math 207 and your reading of the book the basic terminology:
• matrices
• scalar multiplication with matrices
• matrix multiplication
• the identity matrix
• matrix algebra
3. in class we recalled the definition of transpose
4. starting the proof (by induction!) that the transpose of the sum of $k$ matrices is the sum of the transposes of those matrices, $\forall k$.
• Miniquiz 6
• Today [Monday] is the last day to drop classes without a grade being recorded
• T:
• Content:
1. finishing the induction proof that the transpose of the sum of $k$ matrices is the sum of the transposes of those matrices, for any $k$.
2. how to submit electronic Journal entries and HWs.
3. some work with matrix multiplication:
• a very condensed version of the definition
• remember: this operation is very rarely commutative
4. stated without proof that the transpose of the product of $k$ square matrices is the product in the opposite order of the transposes of those matrices; noted that a proof of this would also use induction.
5. [skew-]symmetric matrices, properties:
• symmetric and skew-symmetric matrices must be square
• a skew-symmetric matrix always has zeros on the diagonal
• Hand in HW3: 3.1.37
• Hand in (or submit electronically) your Journal 1 on §§1.1-1.3, 2.1, & 2.2 and Journal 2 on §§2.3, 3.1, & 3.2. See this link (or Blackboard) for more info on what is expected of you and how to do it.
• Miniquiz 7
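The two transpose facts from today (the sum rule we proved by induction, and the reversed-order product rule) can be spot-checked in Python for the base case $k=2$ (the matrices here are arbitrary examples, not from class):

```python
def transpose(A):
    """Transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*A)]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, -2]]

# (A + B)^T = A^T + B^T  -- the k = 2 case of the induction
assert transpose(add(A, B)) == add(transpose(A), transpose(B))
# (A B)^T = B^T A^T  -- note the reversed order
assert transpose(mul(A, B)) == mul(transpose(B), transpose(A))
```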
• W:
• Content:
1. going over HW2
• style is important in proofs! ... Be stylish!
• it helps to have a guess as to what you want to prove, then simply to write down all the definitions and see how they relate to each other (the unpacking strategy, after you decide what you want to unpack)
2. a very logical day:
• the details of an if–then statement, i.e., one in the form $P\Rightarrow Q$
• the converse of $P\Rightarrow Q$ (which is $Q\Rightarrow P$).
• the contrapositive of $P\Rightarrow Q$ (which is $\neg Q\Rightarrow\neg P$).
• the negation of $P\Rightarrow Q$ (which is $P\land\neg Q$)
• if $P\Rightarrow Q$ is true, the converse may or may not be true
• an "if-then" statement is true if and only if its contrapositive is true
• the negation of a statement of the form $\forall x\ P(x)$ is $\exists x\ \neg P(x)$
• the negation of a statement of the form $\exists x\ P(x)$ is $\forall x\ \neg P(x)$
• therefore, discussing what is the negation of the statement $\forall a,b,c\in\RR\ (a\vec{u}+b\vec{v}+c\vec{w}=\vec{0}\Rightarrow a=b=c=0)$
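The propositional-logic facts from today can be verified exhaustively by truth table; a quick Python sketch:

```python
from itertools import product

def implies(p, q):
    """Material implication: p => q is false only when p is true and q is false."""
    return (not p) or q

for p, q in product([False, True], repeat=2):
    # an "if-then" statement is true if and only if its contrapositive is true
    assert implies(p, q) == implies(not q, not p)
    # the negation of p => q is (p and not q)
    assert (not implies(p, q)) == (p and not q)
# the converse, however, need not agree: here p => q holds but q => p fails
assert implies(False, True) and not implies(True, False)
print("all four truth assignments checked")
```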
• F:
• Hand in HW4: 3.2.33 and 3.3.46
• Content:
1. going over some recent HW and miniquizzes
2. discussion of how thinking of and writing down proofs is a serious new skill we are working on in this class, so we should definitely not expect it to be terrifically easy or to come quickly
3. definition of the inverse of a matrix, and what it means for a matrix to be invertible
4. proved the Theorem: Given $A$ and $B$ invertible matrices, $A\,B$ will be invertible, and $(A\,B)^{-1}=B^{-1}\,A^{-1}$.
• Maxiquiz 3 today
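The Theorem proved today, $(A\,B)^{-1}=B^{-1}\,A^{-1}$, can be checked numerically for $2\times 2$ matrices (example matrices chosen for illustration; the inverse here uses the familiar $ad-bc$ formula):

```python
def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv(A):
    """Inverse of a 2x2 matrix via the ad - bc formula."""
    (a, b), (c, d) = A
    det = a * d - b * c
    assert det != 0, "matrix is not invertible"
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]
B = [[1, 3], [0, 1]]
lhs = inv(mul(A, B))           # (AB)^{-1}
rhs = mul(inv(B), inv(A))      # B^{-1} A^{-1} -- note the reversed order
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
```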

• M:
• Content:
1. going over Maxiquiz 4
2. the null space of a matrix
3. nullity
4. The Rank[-Nullity] Theorem
• Hand in HW6: 3.5.56, 3.5.60 (these are serious problems — don't hesitate to contact me and ask for a hint).
• T:
• Content:
1. The Basis Theorem
2. coordinates w.r.t. a basis (what amounts to Theorem 3.29 in §3.5)
3. a linear transformation
4. examples of linear transformations:
• the trivial transformation which sends every input to $\vec{0}$.
• some random formulæ
• rotations of $\RR^2$
• reflections of $\RR^2$
5. a Proposition: If $f:\RR^n\to\RR^m$ is a linear transformation, then $f(\vec{0})=\vec{0}$.
• Journal 3 on §3.5 is due today
• Miniquiz 10
• W:
• Content:
1. more examples of linear transformations
2. a linear transformation coming from matrix multiplication — what the book (and almost no one else) calls a "matrix transformation"
3. a linear transformation is determined by what it does to a basis
4. the matrix $[T]$ of a linear transformation $T$, with examples:
• a counterclockwise rotation of $\RR^2$ by the angle $\theta$ has matrix $\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix}$.
• a reflection of $\RR^2$ across the $y$-axis has matrix $\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}$ .
• projection onto the $x$-axis in $\RR^2$ has matrix $\begin{pmatrix}1&0\\ 0&0\end{pmatrix}$ .
5. composition of linear transformations
• it's linear, too
• its matrix is the product of the matrices of the constituent transformations
• Miniquiz 11
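The last point (the matrix of a composition is the product of the matrices) can be seen concretely: rotating and then reflecting a vector gives the same answer as multiplying by the product matrix. A Python sketch, using the rotation and reflection matrices from today (the test vector and angle are arbitrary):

```python
import math

def mat_vec(M, v):
    """Apply a 2x2 matrix to a vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = math.pi / 3
R = [[math.cos(theta), -math.sin(theta)],   # counterclockwise rotation by theta
     [math.sin(theta),  math.cos(theta)]]
F = [[-1, 0], [0, 1]]                       # reflection across the y-axis

v = [1.0, 2.0]
direct = mat_vec(F, mat_vec(R, v))          # rotate, then reflect
composed = mat_vec(mul(F, R), v)            # multiply by the product matrix F R
assert all(abs(a - b) < 1e-12 for a, b in zip(direct, composed))
```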
• F:
• Content:
• the determinant determines if a matrix is invertible: $\det(A)\ne 0\ \Leftrightarrow\ A$ is invertible.
• $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ — which is amazing, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other
• Maxiquiz 5 today
• Hand in HW7: 3.6.4, 3.6.8, 3.6.44
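The multiplicativity of the determinant is easy to spot-check for $2\times 2$ matrices (example matrices chosen for illustration); note that it holds for both orders of the product, even though $AB\ne BA$ in general:

```python
def det2(A):
    """Determinant of a 2x2 matrix: ad - bc."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [1, 1]]   # det(A) = 1, so A is invertible
B = [[1, 3], [2, 1]]   # det(B) = -5
# det turns the (non-commutative!) matrix product into an ordinary product
assert det2(mul(A, B)) == det2(A) * det2(B)
assert det2(mul(B, A)) == det2(A) * det2(B)
```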

• M:
• Content:
1. going over Maxiquiz 5
2. the identity transformation of $\RR^n$
3. the inverse of a linear/matrix transformation
4. definition of the determinant
• for $1\times 1$ matrices
• for $2\times 2$ matrices
• recursively for $n\times n$ matrices, where $n\ge 3$ — which is essentially Laplace's Expansion Theorem
• [there is actually a direct (=non-recursive) formula]
5. properties of determinants:
• $\det(A)\ne 0\ \Leftrightarrow\ A$ is invertible.
• $\forall A,\ \det(A)=\det(A^T)$
• $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$
• determinants for triangular matrices
• Miniquiz 12
• T:
• Content:
1. eigenvectors, eigenvalues, and eigenspaces
2. an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$.
3. the characteristic polynomial/equation of an $n\times n$ matrix
4. the algebraic multiplicity of an eigenvalue
• Journal 4 on §§3.6 & 4.2 is due today
• Hand in HW8: 4.2.53, 4.2.54, 4.2.69
• Miniquiz 13
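For a $2\times 2$ matrix the characteristic polynomial is $\lambda^2-\operatorname{tr}(A)\lambda+\det(A)$, so the eigenvalues come from the quadratic formula. A Python sketch with an example matrix (chosen for illustration) whose eigenvalues are 5 and 2:

```python
import math

A = [[4, 1], [2, 3]]
# characteristic polynomial: lambda^2 - tr(A) lambda + det(A)
tr = A[0][0] + A[1][1]                            # 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]       # 10
disc = tr * tr - 4 * det                          # 9
lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2
print(lam1, lam2)   # 5.0 2.0
```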
• W:
• Content:
1. the geometric multiplicity of an eigenvalue
2. definition of matrix similarity
3. examples of eigenspaces and multiplicities of eigenvalues
• Miniquiz 14
• F:
• Content:
1. eigenvalues of triangular matrices
2. eigenvalues of invertible matrices
3. linear independence of eigenvectors corresponding to distinct eigenvalues
4. properties of matrix similarity:
• reflexive
• symmetric
• transitive
• ...so it's an equivalence relation
5. properties that similar matrices have in common
6. diagonalizable matrices
7. building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
• a $2\times 2$ example
8. The Diagonalization Theorem
• Maxiquiz 6 today
• Hand in HW9: 4.1.35, 4.1.37, 4.3.20-22

• M:
• T:
• Hand in Journal 5, if you are doing it on paper
• Hand in HW10: 4.4.40-42, 4.4.47-48
• Midterm I in class today
• W:
• Content:
1. going over Midterm I
• F:
• Content:
1. things to notice about an invertible matrix, say called $P$:
• the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
• conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is invertible $n\times n$.
• multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j,$ again for $1\le j\le n$.
• multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j,$ for $1\le j\le n$.
2. what this has to do with diagonalization:
• if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? because $$\left(P^{-1}AP\right)\vec{e_j}=P^{-1}\left(A\left(P\vec{e_j}\right)\right)=P^{-1}\left(A\vec{p_j}\right)=P^{-1}\left(\lambda_j\vec{p_j}\right)=\lambda_j\left(P^{-1}\vec{p_j}\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]
3. so we get the Diagonalization Theorem, two versions:
• if separate bases of all the eigenspaces of an $n\times n$ matrix $A$ when put together yield a basis of all of $\RR^n$, then $A$ is diagonalizable.
• if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
4. a corollary of the Diagonalization Theorem is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
• Hand in revised solutions to Midterm I, if you like
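The computation $P^{-1}AP=$ diagonal can be checked numerically; a Python sketch with an example $2\times 2$ matrix (chosen for illustration) whose eigenvalues are 5 and 2, with eigenvectors $(1,1)$ and $(1,-2)$:

```python
def mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def inv2(P):
    """Inverse of a 2x2 matrix via the ad - bc formula."""
    (a, b), (c, d) = P
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[4, 1], [2, 3]]      # eigenvalues 5 and 2
P = [[1, 1], [1, -2]]     # columns are eigenvectors for 5 and for 2
D = mul(inv2(P), mul(A, P))
# the off-diagonal entries vanish and the eigenvalues appear on the diagonal
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
assert abs(D[0][0] - 5) < 1e-12 and abs(D[1][1] - 2) < 1e-12
```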

• M:
• Content:
4. our brick-throwing demo from last week is related to the inertia tensor, and applying the Spectral Theorem yields something called The Principal Axes Theorem in physics
• T:
• Content:
1. defining vector space
2. starting examples (and non-examples) of vector spaces:
• the trivial vector space $\{\vec{0}\}$
• $\RR^n$ with the usual vector addition and scalar multiplication
• $\RR^2$ with modified vector addition(s) is often not a vector space
• Start HW15, which is due tomorrow
• Miniquiz 22
• W:
• Content:
1. some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition (e.g., in any vector space $V$, $0\vec{u}=\vec{0}\ \forall \vec{u}\in V$).
2. more examples of vector spaces:
• spaces of functions, such as:
• $\Ff(\RR)$ — the space of all functions on the real line $\RR$
• $C(\RR)$ — the space of continuous functions on the real line $\RR$
• $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
• $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
• $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
• $\Pp$ — the space of all polynomials in one variable
all with the pointwise addition of functions and scalar multiplications on functions
• $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.
• Hand in HW15: 5.5.32, 5.5.38, 5.5.42
• Miniquiz 23
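The fact that $0\vec{u}=\vec{0}$ in every vector space has a short proof straight from the axioms; here is a sketch:

```latex
% In any vector space V, for every \vec{u} in V:
0\vec{u} = (0+0)\vec{u} = 0\vec{u} + 0\vec{u}
% now add -(0\vec{u}) to both sides and use associativity:
\implies \vec{0} = 0\vec{u} + \bigl(-(0\vec{u})\bigr)
                 = \bigl(0\vec{u} + 0\vec{u}\bigr) + \bigl(-(0\vec{u})\bigr)
                 = 0\vec{u} + \Bigl(0\vec{u} + \bigl(-(0\vec{u})\bigr)\Bigr)
                 = 0\vec{u} + \vec{0} = 0\vec{u}.
```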
• F:
• Content:
1. more discussion of the vector spaces of functions we defined last class; these form a chain of subspaces:
$\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
2. defining [vector] subspace
3. examples of subspaces
4. how to check if something is a subspace
• Maxiquiz 9 today

• Spring Break! No classes, of course.
• But please catch up on any old homeworks or re-dos of old assignments, to hand in on Monday after the break.
• Also, do not forget that HW16 is due the first day after break, and Journal 7 the day after that

• M:
• Content:
1. going over Maxiquiz 9 and recent HWs
2. still more about vector subspaces
3. another vector space example: $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices.
4. the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
• Hand in HW16: 6.1.{2, 6, 12, 48}
• Miniquiz 24
• T:
• Content:
1. defining Span in an abstract vector space
2. properties of Span in an abstract vector space
3. defining linearly [in]dependent in an abstract vector space, and examples
4. defining basis in an abstract vector space, and examples
5. defining dimension in an abstract vector space, and examples
• Journal 7 on §§5.4, 5.5 (only pp. 411-418), & 6.1 is due today
• Miniquiz 25
• W:
• Content:
1. a little more care with finite and infinite sets of vectors which might be linearly [in]dependent
2. defining [in]finite dimensional for an abstract vector space, and examples
3. the Basis Theorem in an abstract vector space
• Miniquiz 26
• F:
• Content:
1. yet more care with finite and infinite sets of vectors which might be linearly [in]dependent
2. be careful with the definition of dimension in the book; it doesn't quite work!
3. coordinates of a vector $\vec{v}$ with respect to a basis $\Bb$ in an abstract vector space: $[\vec{v}]_\Bb\in\RR^k$, if $\Bb$ consists of $k$ basis vectors
• Hand in HW17: 6.1.46, 6.2.{6, 34, 44}
• Maxiquiz 10 today

• M:
• Content:
1. going over Maxiquiz 10 and recent HWs
• a change-of-basis matrix $P_{\Cc\leftarrow\Bb}$ for converting from basis $\Bb$ to basis $\Cc$, by $[\vec{v}]_\Cc=P_{\Cc\leftarrow\Bb}\,[\vec{v}]_\Bb$
• this $P_{\Cc\leftarrow\Bb}$ will be $k\times k$ if $\Bb$ (and thus also $\Cc$) consists of $k$ vectors
• $P_{\Cc\leftarrow\Bb}$ is invertible: in fact, $P_{\Cc\leftarrow\Bb}^{-1}=P_{\Bb\leftarrow\Cc}$
• Miniquiz 27
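A concrete change-of-basis example in $\RR^2$ (the basis and coordinates here are chosen for illustration): take $\Bb=\{(1,1),(1,-1)\}$ and let $\Cc$ be the standard basis, so $P_{\Cc\leftarrow\Bb}$ has the $\Bb$-vectors as its columns. A Python sketch:

```python
def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def inv2(P):
    (a, b), (c, d) = P
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# columns of P_{C<-B} are the B-basis vectors written in C-coordinates
P_CB = [[1, 1], [1, -1]]
v_B = [2.0, 3.0]                 # coordinates of some v with respect to B
v_C = mat_vec(P_CB, v_B)         # v = 2(1,1) + 3(1,-1) = (5, -1)
assert v_C == [5.0, -1.0]
# converting back uses the inverse: P_{B<-C} = P_{C<-B}^{-1}
v_back = mat_vec(inv2(P_CB), v_C)
assert v_back == [2.0, 3.0]
```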
• T:
• Content:
1. more examples of change-of-basis matrices
2. definition of a linear transformation between abstract vector spaces
• Miniquiz 28
• W:
• Content:
1. examples of linear transformations
• the zero transformation
• the identity transformation
• matrix multiplication
• differentiation in $\Pp$
2. review for Midterm II; see this review sheet
• Hand in HW18: 6.3.16, 6.3.21 [Hint: use Thm 6.12b.], 6.4.{20, 22, 24}
• Miniquiz 29 (handed out as a take-home quiz, due Friday)
• F:
• Midterm II in class today
• Journal 8 on §§6.2-6.4 is due today

• M:
• Content:
1. going over Midterm II
• T:
• Content:
1. linear transformations are determined by what they do to a basis, which is, however, completely free: that is, if we want to make a linear transformation $T:V\to W$, and if $\{\vec{v}_1,\dots,\vec{v}_n\}$ is a basis of $V$, we can choose any vectors $\vec{w}_1,\dots,\vec{w}_n$ we like in $W$, and there will be a unique linear $T$ which satisfies $$T(\vec{v}_1)=\vec{w}_1,\quad\dots,\quad T(\vec{v}_n)=\vec{w}_n\ \ .$$
2. composition of linear transformations
3. inverses of linear transformations
• Hand in revised solutions to Midterm II, if you like
• W:
• Content:
1. last words about inverses of linear transformations
2. the kernel of a linear transformation
3. the range of a linear transformation
4. one-to-one (or 1-1 or injective)
5. onto (or surjective)
• Miniquiz 30
• F:
• Content:
1. kernels and ranges are always vector subspaces
2. rank and nullity (again)
3. The Rank-Nullity Theorem (again)
4. isomorphisms and isomorphic
• Hand in HW19: 6.4.{26, 32}, 6.5.{4, 24, 27, 34}
• Maxiquiz 11 today

• M:
• Content:
1. going over Maxiquiz 11
2. a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic
3. an inner product and inner product space
4. examples of inner product spaces:
• $\RR^n$ with the usual dot product
• the $L^2$ inner product on $C[0,1]$
• Journal 9 on §§6.5 & 6.6 is due today
• Miniquiz 31
• T:
• Content:
1. elementary properties of inner products
2. length or norm of vectors in an inner product space
3. the Pythagorean Theorem
4. orthogonal vectors in an inner product space
5. projections and the Gram-Schmidt Process in an inner product space
6. an orthonormal set in the inner product space $C[-\pi,\pi]$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios
• Miniquiz 32
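The Gram-Schmidt Process from today can be sketched in Python for the usual dot product on $\RR^n$ (the input vectors are an arbitrary example; with a different inner product, only the `dot` function would change):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors in R^n."""
    basis = []
    for v in vectors:
        # subtract the projections onto the orthonormal vectors found so far
        w = v[:]
        for e in basis:
            c = dot(w, e)
            w = [wi - c * ei for wi, ei in zip(w, e)]
        n = math.sqrt(dot(w, w))
        basis.append([wi / n for wi in w])
    return basis

e1, e2 = gram_schmidt([[3.0, 4.0], [1.0, 0.0]])
assert abs(dot(e1, e2)) < 1e-12          # orthogonal
assert abs(dot(e1, e1) - 1) < 1e-12      # unit length
```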
• W:
• Content:
1. the distance between vectors in an inner product space
2. a norm and a normed linear space
3. the distance function $d(\vec{u},\vec{v})$ in a normed linear space
4. a metric and metric space
5. examples of norms/metrics:
• the norm coming from an inner product
• the sum norm (also called the $L^1$ norm)
• the max norm (also called the sup or $L^\infty$ norm)
• the taxicab metric
• Hand in HW20: 7.1.{36, 40}, 7.2.{8, 14}
• Journal 10 on §§7.1 & 7.2 [only the parts we covered in this class] is due today
• Miniquiz 33
• Hand in all late work and re-dos by 4pm today if you want them corrected and returned to you on Friday
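The different norms from today are easy to compare on a single example vector (chosen for illustration); note how each norm gives a metric via $d(\vec{u},\vec{v})=\|\vec{u}-\vec{v}\|$:

```python
import math

def norm_1(v):      # the sum norm (also called the L^1 norm)
    return sum(abs(x) for x in v)

def norm_max(v):    # the max norm (also called the sup or L^infinity norm)
    return max(abs(x) for x in v)

def norm_2(v):      # the norm coming from the usual inner product
    return math.sqrt(sum(x * x for x in v))

v = [3.0, -4.0]
print(norm_1(v), norm_max(v), norm_2(v))   # 7.0 4.0 5.0
# the taxicab metric is the distance function of the sum norm
u = [1.0, 1.0]
taxicab = norm_1([a - b for a, b in zip(u, v)])
assert taxicab == 7.0
```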
• F:
• Content:
1. going over recent HWs
2. review for the Final Exam next week; see this review sheet
• Last day to hand in [by noon!] all late work and re-dos for class credit [although materials handed in today will not be returned]