Math 307 — Introduction to Linear Algebra

Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

This schedule will change **very frequently**; please check it at
least every class day, and before starting work on any assignment (in case the
content of the assignment has changed).

In the following, all reading assignments, sections, and page numbers refer to
the required course textbook, *Linear Algebra, A Modern Introduction
(3rd edition)*, by David Poole, unless
otherwise specified.

If you see the symbol below, it means that class was videoed and you can get a link by e-mailing me. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day for you.

When there is a reading assignment, please read the named section(s) **before
that day**.

Homework for a particular day is **due that day**, either in class or
handed in at my office **by 3pm**.

*:*
- bureaucracy and introductions
- what is Linear Algebra (the study of vector spaces and linear transformations...)
- why we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for
- an example of an application of linear algebra: the $419 billion eigenvalue problem: **Google**'s **PageRank** algorithm
- Read the course syllabus and policy page.
- **HW0** (*Send me e-mail* at `jonathan@poritz.net`) telling me:
  - Your name.
  - Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
  - Your year/program/major at CSUP.
  - What you intend to do after CSUP, in so far as you have an idea.
  - Past math classes you've had.
  - The reason you are taking this course.
  - Your favorite mathematical subject.
  - Your favorite mathematical result/theorem/technique/example/problem.
  - Anything else you think I should know (disabilities, employment or other things that take a lot of time, *etc.*).
  - [Optional:] The name of a good book you have read recently.

**Miniquiz 0 today**
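Since PageRank comes up above as "the $419 billion eigenvalue problem," here is a tiny sketch of the idea in Python. The ranking vector is the dominant eigenvector of a modified link matrix, found by power iteration; the four-page link graph and the damping factor 0.85 below are made-up illustration, not Google's actual data.

```python
# Toy PageRank: rank = dominant eigenvector of the "Google matrix",
# computed by power iteration. The 4-page link graph is a made-up example.
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}  # page -> pages it links to
n, d = 4, 0.85                                  # number of pages, damping factor

rank = [1.0 / n] * n
for _ in range(100):                            # power iteration
    new = [(1 - d) / n] * n
    for page, outs in links.items():
        for target in outs:
            new[target] += d * rank[page] / len(outs)
    rank = new

assert abs(sum(rank) - 1.0) < 1e-9              # ranks form a probability vector
```

Page 2, which is linked to by all three other pages, ends up with the largest rank.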

*:* **Read:** *To the Student*, pp. xxv & xxvi, and §§ 1.1 & 1.2
*Content:*
- logical and basic set-theoretic terminology/notation
- some basic sets of numbers: the **natural numbers** $\NN$, **integers** $\ZZ$, **rationals** $\QQ$, and **real numbers** $\RR$
- starting good definitional style, including:
  - **all** variables must be "bound"
  - clearly identify the symbol and/or terminology being defined
  - clearly identify the type of object being defined
- **quantifiers** $\forall$ and $\exists$
- **vectors in $\RR^n$**, **vector addition**, and **scalar multiplication**
- the **dot product** and **norms**
- some basic properties
  - of vector arithmetic
  - of dot products and norms
- the triangle inequality
- the **Cauchy-Schwarz Inequality**

**Miniquiz 1 today**
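The triangle and Cauchy-Schwarz inequalities above can be spot-checked numerically. This short sketch (the vectors are random, the dimension 3 is an arbitrary choice) tests both on a thousand random pairs:

```python
# Numerically spot-check the Cauchy-Schwarz and triangle inequalities in R^3.
import math, random

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

random.seed(1)
for _ in range(1000):
    u = [random.uniform(-10, 10) for _ in range(3)]
    v = [random.uniform(-10, 10) for _ in range(3)]
    # Cauchy-Schwarz: |u.v| <= ||u|| ||v||
    assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-9
    # triangle inequality: ||u + v|| <= ||u|| + ||v||
    assert norm([a + b for a, b in zip(u, v)]) <= norm(u) + norm(v) + 1e-9
```

Of course a numerical check is not a proof, but it is a good sanity test of the statements (and of your understanding of them).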

*:* **[Re]Read:** §§ 1.1 & 1.2
*Content:*
- **linear combinations of vectors**
- **angles between vectors**
- **orthogonal** vectors, the notation $\vec{v}\perp\vec{w}$
- **unit vectors**
- **distances between vectors**
- **projection of one vector onto another**
- proof that $\operatorname{proj}_{\vec{u}}(\vec{v})$ is the vector on the line along the vector $\vec{u}$ which is closest to $\vec{v}$.

**Miniquiz 2 today**
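The closest-point property of the projection proved above can also be checked numerically: sample many points $t\vec{u}$ on the line along $\vec{u}$ and verify none is closer to $\vec{v}$ than $\operatorname{proj}_{\vec{u}}(\vec{v})$. A sketch (the two vectors are made-up examples):

```python
# Check that proj_u(v) is the point on the line along u closest to v.
import math

def dot(u, v): return sum(a * b for a, b in zip(u, v))
def dist(u, v): return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

u, v = [2.0, 1.0], [1.0, 3.0]                  # made-up example vectors
c = dot(u, v) / dot(u, u)                      # projection coefficient
proj = [c * ui for ui in u]                    # proj_u(v)

best = min(dist([t / 100 * ui for ui in u], v) # sample points t*u on the line
           for t in range(-500, 501))
assert dist(proj, v) <= best + 1e-9            # no sampled point beats the projection
```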

*:* **[Re]Read:** §§ 1.1-1.3 & 2.1
*Content:*
- going over past miniquizzes and things like the HW problems:
  - always bind variables, and use quantifiers when appropriate (*e.g.,* often $\forall$ is in properties, like commutativity, associativity, *etc.*)
  - restate the problem more precisely if it helps
- some other proofs from the exercises of §1.2 — practice proof-writing; *e.g.,* proof that $d(\vec{u},\vec{v})=0$ iff $\vec{u}=\vec{v}$.
- the phrase "*if and only if*," and its synonyms "*iff*" and "$\Leftrightarrow$"
- **linear equations** and **systems of linear equations**
- **solutions of linear systems**
- examples of linear systems with no solutions, a unique solution, and an infinite number of solutions
- **Maxiquiz 1 today**
- Hand in **I³1** and **HW1**: 1.2.{34, 60, 62}
- **Today [Friday] is the last day to add classes.**

*:* **[Re]Read:** §2.1 and **Read:** §2.3
*Content:*
- **[in]consistent linear system**
- the **solution set** of a linear system and its description as intersecting planes
- the **coefficient** and **augmented matrices** of a linear system
- relationship between a linear system being *homogeneous* and being *consistent*
- **linear combination** [again]
- **Span**, a few basic examples and properties (*e.g.,* the span of a single vector is the line along that vector)
- result that a linear system is consistent iff the vector consisting of the right-hand-side constant values is in the span of the columns of the coefficient matrix

**Miniquiz 3 today**

*:* **[Re]Read:** §2.3
*Content:*
- **[non]trivial** for both *linear combinations* and *solutions of a linear system*
- some elementary facts about $\Span$, such as that each of the vectors $\vec{0},\vec{v}_1,\dots,\vec{v}_k$ is in the set $\Span(\vec{v}_1,\dots,\vec{v}_k)$.
- **linearly [in]dependent** vectors — note it is important that the scalars in the definition are *not all zero*
- the set $\{\vec{0}\}$ is linearly dependent — in fact, any collection of vectors containing the zero vector is linearly dependent.

**Miniquiz 4 today**

*:* **[Re]Read:** §2.3
*Content:*
- if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly dependent set, then one of the $\vec{v_j}$ is a linear combination of the remaining vectors.
- more on proofs **by contradiction**; the structure:
  - You want to prove $P\Rightarrow Q$.
  - You assume $P$.
  - You assume $\sim Q$ (read that "*not* $Q$").
  - You use those assumptions, logic, calculations, unpacking definitions, prior theorems, *etc.*, to derive a *contradiction* — some statement $S$ and $\sim S$ being simultaneously true. We write "$\Rightarrow\Leftarrow$" when the contradiction is found.
  - At this point, the last assumption you made must be false. Since that assumption was $\sim Q$, it must be instead that $Q$ is true.
- **Miniquiz 5 today**
- Hand in **I³2** and **HW2**: 1.2.70, 2.2.44, 2.3.{20, 44}

*:* **Read:** §§2.2 & 2.3
*Content:*
- some mathematical logic: Start with a statement $S$ of the form $P\Rightarrow Q$. Then we define:
  - the **converse** of $S$ is the statement $Q\Rightarrow P$
  - the **contrapositive** of $S$ is the statement $\sim Q\Rightarrow\sim P$
- The truth of the converse is completely independent of the truth of the original statement: either one could be true or false without affecting the other.
- The contrapositive of a statement is logically equivalent to the statement. *I.e.,* $S$ is true if and only if its contrapositive is true. Note: this is commonly used logic even in real life. *E.g.:* it is true that if the Sun goes supernova today, then the Earth will be vaporized tomorrow. Equivalently, if the Earth, tomorrow, is not vaporized, then the Sun must not have gone supernova today!
- proof that if $\{\vec{v_1},\dots,\vec{v_k}\}$ is a linearly independent set, then any subset is also linearly independent — by proving the contrapositive!
- defined terms:
  - **elementary row operation** (applied to a matrix), shortened to **ERO**
  - a matrix $M$ may be in **row-echelon form**, shortened to "$M$ is **REF**"
  - a matrix $M$ may be in **reduced row-echelon form**, shortened to "$M$ is **RREF**"
- **Maxiquiz 2 today** ... which became a **take-home quiz**. Please take this seriously: no consultation with others (including your textbook and the Internet), and sit down and work on it at one go — do not work for a while, get up, do other things, come back and do more, *etc.*

*:*
- Yes, **we do have class today**, even though it is the federal holiday celebrating the achievements of workers and the labor movement — if you like the 40-hour work week, now is the time to give thanks.
- **Maxiquiz 2** is due at the beginning of class.
- **Today [Monday] is the last day to drop classes without a grade being recorded.**

**Read:** §2.2
*Content:*
- **row reduction**
- **free variables**
- **rank**
- the **Rank Theorem**
- the relationship of **linear [in]dependence** and the linear system whose coefficient matrix has columns which are the vectors under consideration: there will be a non-trivial solution of the homogeneous linear system with that coefficient matrix if and only if the vectors are linearly dependent
- a term is **well-defined** if any choices in its definition are mentioned explicitly
- *rank* is well-defined because matrices always do have an RREF form, and that form is unique; they also always do have an REF form, but that is not unique.
- necessary linear dependence of $k$ vectors in $\RR^n$ if $k>n$
- **context** and **type** for definitions ... see the handout on definitions for more on this theme

**Miniquiz 6 today**
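Row reduction is a completely mechanical process, which makes it a natural thing to code. A bare-bones sketch (the example matrix is made up; real numerical code would use partial pivoting for stability):

```python
# Row-reduce a matrix to RREF by EROs; rank = number of non-zero rows of the RREF.
def rref(M):
    M = [row[:] for row in M]                # work on a copy
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(cols):
        # find a row at or below pivot_row with a non-zero entry in this column
        pr = next((r for r in range(pivot_row, rows) if abs(M[r][col]) > 1e-12), None)
        if pr is None:
            continue                          # free column: no pivot here
        M[pivot_row], M[pr] = M[pr], M[pivot_row]      # swap ERO
        p = M[pivot_row][col]
        M[pivot_row] = [x / p for x in M[pivot_row]]   # scale pivot to 1
        for r in range(rows):                 # clear the column everywhere else
            if r != pivot_row and abs(M[r][col]) > 1e-12:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return M

def rank(M):
    return sum(any(abs(x) > 1e-9 for x in row) for row in rref(M))

A = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [1.0, 0.0, 1.0]]  # second row = 2 * first
assert rank(A) == 2
```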

*:* **Read:** §§3.1 & 3.2
*Content:*
- definitions for matrices
- equality for matrices
- addition/subtraction for matrices
- scalar multiplication for matrices
- matrix multiplication — **not commutative!**
- the **identity matrix**
- **elementary matrices**
- transposes of matrices; transposes and multiplication
- [skew-]symmetric matrices
- properties of matrix operations
- Hand in **I³3** and **HW3**: 2.3.48, Chapter Review Exercises p.141: 14, 16, 18
- **Miniquiz 7 today**
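The non-commutativity of matrix multiplication warned about above takes only a few lines to witness; the two $2\times2$ matrices below are arbitrary examples:

```python
# Matrix multiplication is not commutative: a quick counterexample.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
assert matmul(A, B) != matmul(B, A)   # AB = [[2,1],[1,1]] but BA = [[1,1],[1,2]]
```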

*:* **Read:** §3.3
*Content:*
- the **inverse** of a matrix
- an **invertible** matrix *vs.* a **non-invertible** or **singular** one
- uniqueness of inverses
- inverses of elementary matrices
- inverses of $2\times2$ matrices
- inverses and transposes
- inverses and solving linear systems
- the book's **Fundamental Theorem of Invertible Matrices**

**Miniquiz 8 today**

*:* **Read:** §§3.3 & 3.5
*Content:*
- inverse of a product of two matrices
- how about more matrices? We'll need: **proof by induction**, or proofs using the **Principle of Mathematical Induction**, which goes like this:
  - It only applies to theorems of the specific form "$\forall n\in\NN\ S(n)$ is true," where $S(n)$ is a mathematical statement which depends upon a natural number parameter $n$.
  - First one proves that $S(1)$ is true; this is called the **base case**.
  - Then one proves "If $S(n)$, then $S(n+1)$"; this is called the **inductive step** and, during the proof of this step, when one invokes the hypothesis $S(n)$, one calls it the **inductive hypothesis**.
  - One declares the theorem proven **by induction** (and goes home happy).
- an example of an inductive proof: showing that $\sum_{j=1}^n j = \frac{n(n+1)}{2}$.
- another example: proving that $\forall n\in\NN$, if $A_1,\dots,A_n$ are invertible matrices, then $\left(A_1\cdot\dots\cdot A_n\right)^{-1}=A_n^{-1}\cdot\dots\cdot A_1^{-1}$.
- definition of a **subspace of $\RR^n$**: it is a subset $S\subseteq\RR^n$ satisfying the properties:
  - $\vec{0}\in S$
  - $\forall \vec{u},\vec{v}\in S\ \ \vec{u}+\vec{v}\in S$
  - $\forall \vec{u}\in S,\forall\alpha\in\RR\ \ \alpha\vec{u}\in S$
- examples of subspaces of $\RR^2$ and $\RR^3$:
  - the **trivial subspace** $\left\{\vec{0}\right\}$
  - any line through the origin
  - any plane through the origin
  - the whole thing ($\RR^2$ or $\RR^3$ as a subspace of itself)

**Maxiquiz 3 today**
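The two induction examples above both make claims that can be tested numerically for many values of $n$ (a test, not a proof, of course). A sketch using NumPy for the matrix part; the random $3\times3$ matrices are shifted by $3I$ simply so they are (almost surely) invertible and well-conditioned:

```python
# Check 1 + 2 + ... + n = n(n+1)/2 and (A1 A2 A3 A4)^{-1} = A4^{-1} A3^{-1} A2^{-1} A1^{-1}.
import numpy as np

for n in range(1, 50):
    assert sum(range(1, n + 1)) == n * (n + 1) // 2

rng = np.random.default_rng(0)
# random matrices, shifted to be (almost surely) invertible
mats = [rng.standard_normal((3, 3)) + 3 * np.eye(3) for _ in range(4)]
prod = mats[0] @ mats[1] @ mats[2] @ mats[3]
rev = (np.linalg.inv(mats[3]) @ np.linalg.inv(mats[2])
       @ np.linalg.inv(mats[1]) @ np.linalg.inv(mats[0]))
assert np.allclose(np.linalg.inv(prod), rev)   # inverses in reverse order
```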

*:*
- Hand in **I³4** and **HW4**: 3.1.38, 3.2.{26, 36}, 3.3.{42, 44}

**[Re]Read:** §3.5
*Content:*
- Another inductive proof: $\forall n\in\NN$, one can construct a segment of length $\sqrt{n}$ with a ruler and compass.
- here's a nice inductive proof: All pigs are yellow. *[See if you can find the flaw in that proof.]*
- going over some recent HW and quizzes
- the rest of today is *subspace day:*
- thinking more about *subspaces of $\RR^n$*... Notice that the first part of the book's definition is actually not necessary as long as the subset $S$ is non-empty: if it has any vector $\vec{u}$ at all, then it has $0\cdot\vec{u}=\vec{0}$ as well. And the second and third parts of the book's definition are together saying that *subspaces are closed under linear combinations*. So here is another, equivalent definition of what it means for a subset $S\subseteq\RR^n$ to be a **subspace**:
  - $S\neq\emptyset$ (remember the notation for the *empty set*)
  - $\forall \vec{u},\vec{v}\in S,\forall\alpha,\beta\in\RR\ \ \alpha\vec{u}+\beta\vec{v}\in S$
- filling out our list of examples of *subspaces:*
  - the **trivial subspace** $\left\{\vec{0}\right\}$
  - any line through the origin
  - any plane through the origin
  - a *hyperplane* through the origin in higher dimensions: *e.g.,* the set of vectors $\begin{pmatrix}x_1\\x_2\\x_3\\x_4\end{pmatrix}\in\RR^4$ with components satisfying $a\,x_1+b\,x_2+c\,x_3+d\,x_4=0$, where $a,b,c,d\in\RR$, is a "three-dimensional hyperplane" in $\RR^4$ (think of it as a linear system with only one equation: it has three free variables, so three parameters are needed to specify a point on this hyperplane).
  - $\RR^n$ is itself a subspace of $\RR^n$. (Note that any subspace other than this one, *i.e.,* any subspace of $\RR^n$ which is not all of $\RR^n$, is called a **proper** subspace.)
  - Spans are subspaces: **Theorem:** $\forall n,k\in\NN$ and $\forall\vec{v}_1,\dots,\vec{v}_k\in\RR^n$, $\Span(\vec{v}_1,\dots,\vec{v_k})$ is a subspace of $\RR^n$.
- defining the **row space**, **column space**, and **null space** of a matrix $A$, written $\operatorname{row}(A)$, $\operatorname{col}(A)$, and $\operatorname{null}(A)$
- for an $m\times n$ matrix $A$, $\operatorname{row}(A)$ and $\operatorname{null}(A)$ are vector subspaces of $\RR^n$, while $\operatorname{col}(A)$ is a vector subspace of $\RR^m$

**Miniquiz 9 today**

*:* **[Re]Read:** §3.5
*Content:*
- All pigs are *not* yellow, alas.
- going over some recent HW and quizzes
- the rest of today is *basis day:*
- the span of a bunch of vectors is a subspace, but it need not be an *efficient* way to describe that subspace. Looking for a way to characterize an efficient set of vectors to build a subspace, we defined a **basis** of a subspace of $\RR^n$
- examples of bases:
  - the trivial subspace *does not have a basis*
  - the **standard basis of $\RR^n$** (which we have met before; it is the $n$ vectors $$\vec{e}_1=\begin{pmatrix}1\\0\\0\\\vdots\\0\\0\end{pmatrix},\vec{e}_2=\begin{pmatrix}0\\1\\0\\\vdots\\0\\0\end{pmatrix},\dots,\vec{e}_n=\begin{pmatrix}0\\0\\0\\\vdots\\0\\1\end{pmatrix},$$ where $\vec{e}_j$ is the vector in $\RR^n$ which has a $1$ in the $j^\text{th}$ component and $0$'s everywhere else; the dimension $n$ is not part of the notation $\vec{e}_j$, it must be understood from context) is, as the name suggests, a basis.
- but note that subspaces always have many different bases... (examples)
**Miniquiz 10 today**

*:* **[Re]Read:** §3.5
*Content:*
- today is *dimension day:*
- we've seen that bases are very much *not uniquely determined by their subspaces*... but the number of vectors in a basis does seem to be uniquely determined by the subspace (examples); hence we define the **dimension** of a subspace of $\RR^n$
- *dimension* is *well-defined*, says **The Basis Theorem:** Given a subspace $S$ of $\RR^n$, any two bases of $S$ have the same number of vectors.
- the proof of *The Basis Theorem* ... is pretty
- from the *row space* comes the **rank**
- the *rank* of a matrix is also the dimension of its column space
- from the *null space* comes the **nullity**
- **The Rank-Nullity Theorem**
**Miniquiz 11 today**
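The Rank-Nullity Theorem can be checked on a concrete matrix. In this sketch a $5\times7$ matrix of rank 2 is built as a product of a $5\times2$ and a $2\times7$ random matrix (a generic such product has rank 2), and the nullity is counted via the singular values:

```python
# Rank-Nullity: rank(A) + nullity(A) = n, the number of columns of A.
import numpy as np

rng = np.random.default_rng(1)
B, C = rng.standard_normal((5, 2)), rng.standard_normal((2, 7))
A = B @ C                                   # a 5x7 matrix of rank 2 (generically)

r = np.linalg.matrix_rank(A)
n = A.shape[1]
sing = np.linalg.svd(A, compute_uv=False)
nullity = n - np.sum(sing > 1e-10)          # dimension of the null space
assert r == 2 and r + nullity == n          # rank + nullity = number of columns
```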

*:*
- Hand in **I³5** and **HW5**: 3.5.{4, 40, 58, 62} *[the middle two of these are not long or hard proofs, if you use the powerful results in §3.5 of the book; the last one requires some thought and working out details — but it is a very nice proof when you're done!]*

**[Re]Read:** §3.5
*Content:*
- notice that the row space of a matrix doesn't change as we do EROs to the matrix — the column space *does* change
- a basis of the row space of a matrix will consist of the non-zero rows of its RREF form.
- a little on the proof of *The Rank[-Nullity] Theorem*
- an example of finding the *null space* of a matrix
- please read **The Fundamental Theorem of Invertible Matrices** over the weekend
  - notation: **TFAE** = "the following are equivalent"; *i.e.,* the following statements are all joined by "iff"
- **Maxiquiz 4 today**


*:* **Read:** §3.6
*Content:*
- going over some recent HW and quizzes
- more on finding bases of the *row*, *column*, and *null spaces*
- repeating the **Rank-Nullity Theorem** and mentioning its proof
- discussion of the **Fundamental Theorem of Invertible Matrices** [which, remember, you were supposed to read carefully over the weekend]
- defining a **linear transformation**

**Miniquiz 12 today**

*:* **[Re]Read:** §3.6
*Content:*
- examples of linear transformations
- left multiplication by an $m\times n$ matrix as a linear transformation from $\RR^n$ to $\RR^m$

**Miniquiz 13 today**

*:* **[Re]Read:** §3.6
*Content:*
- finding the matrix of a linear transformation $\RR^n\to\RR^m$
- composition of linear transformations
- Hand in **I³6** and **HW6**: 3.5.64 *[hint: prove that every element of $\operatorname{col}(A+B)$ is the sum of an element of $\operatorname{col}(A)$ and an element of $\operatorname{col}(B)$; then explain why this suffices for the problem]* and 3.6.{4, 8, 44}
- **Miniquiz 14 today**

*:* **[Re]Read:** §3.6
*Content:*
- **domain**, **codomain**, **range**
- *nullity* and injectivity ("1-1ness") of a linear transformation

**Maxiquiz 5 today**

*:* **[Re]Read:** §3.6
*Content:*
- notation for the matrix of a linear transformation $T$ will be $[T]$
- composition of linear transformations and the consequence for their matrices: if $S:\RR^n\to\RR^m$ and $T:\RR^m\to\RR^p$ are linear then the matrix $[T\circ S]$ of the composition $T\circ S:\RR^n\to\RR^p$ satisfies $[T\circ S]=[T]*[S]$ (where "$*$" means matrix multiplication)
- remember: a linear transformation $T:\RR^n\to\RR^m$ is 1-1 if and only if $\operatorname{nullity}([T])=0$
- a linear transformation $T:\RR^n\to\RR^m$ cannot be 1-1 if $n>m$
- if the linear transformation $T:\RR^n\to\RR^m$ is invertible then $n$ must equal $m$

- no miniquiz today, alas...

*:* **Read:** §4.2
*Content:*
- the inverse of a linear transformation, if it exists, is a linear transformation
- definition of the **determinant**
  - for $1\times 1$ matrices
  - for $2\times 2$ matrices
  - recursively for $n\times n$ matrices, where $n\ge 2$ — which is essentially **Laplace's Expansion Theorem**
- Hand in **I³7** and **HW7**: Chapter 3 Review Questions, p.263: 14, 16, 18
- **Miniquiz 15 today**
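The recursive (Laplace expansion) definition of the determinant translates almost word for word into code. A sketch, expanding along the first row; note this does $n!$ work, so it is for understanding the definition, not for computing with large matrices (row reduction does the job in about $n^3$ steps):

```python
# The recursive (cofactor/Laplace) definition of the determinant,
# expanding along the first row. Exponential time: tiny matrices only!
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):                                 # expand along row 0
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

assert det([[3]]) == 3
assert det([[1, 2], [3, 4]]) == -2
assert det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]) == 24    # triangular: product of diagonal
```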

*:* **[Re]Read:** §4.2
*Content:*
- the determinant, using the definition we gave, is *well-defined*
- if the square matrix $A$ has a row or column of zeros, then $\det(A)=0$
- if the square matrix $A$ is upper- or lower-triangular, then $\det(A)$ is the product of the diagonal elements of $A$
- if we get the matrix $B$ by multiplying one of the rows of the square matrix $A$ by the constant $k$, then $\det(B)=k\det(A)$.
- if $A$ is $n\times n$, then $\det(kA)=k^n\det(A)$
- determinants of elementary matrices, and of matrices before and after EROs are done to them
**Miniquiz 16 today**

*:* **[Re]Read:** §4.2
*Content:*
- if $A$ and $B$ are $n\times n$ matrices, then $\det(AB)=\det(A)\det(B)$ — which is *amazing*, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other!
- the determinant **determines** if a matrix is invertible: for an $n\times n$ matrix $A$, $\det(A)\neq0\ \Leftrightarrow\ A$ is invertible.
**Maxiquiz 6 today**

*:*
*Content:*
- going over *Maxiquiz 6*
- some last details on theorems from last Friday
- for an $n\times n$ matrix $A$, $\det(A^T)=\det(A)$
- Review for Test I. See this review sheet.
- Hand in **I³8** and **HW8**: 4.2.{46, 54, 56, 69}
- **Miniquiz 17 today**

*:***Test I in class today.**

*:*- Test I post-mortem.
- no miniquiz today, alas...

*:* **Read:** §4.1 & §4.3
*Content:*
- defining **eigenvectors**, **eigenvalues**, and **eigenspaces**
- examples of $2\times 2$ matrices with 0, 1, and 2 distinct eigenvalues
- **Theorem:** If $\lambda$ is an eigenvalue of the matrix $A$, then $\det(A-\lambda I_{n\times n})=0$.
- an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$.
- the **characteristic polynomial/equation** of an $n\times n$ matrix
- the **algebraic multiplicity** of an eigenvalue
- the **geometric multiplicity** of an eigenvalue
- examples of eigenspaces and multiplicities of eigenvalues
- no maxiquiz today, alas again...
- Hand in Test I revisions, if you like.

*:* **[Re]Read:** §4.3
*Content:*
- eigenvalues of triangular matrices
- eigenvalues of invertible matrices
- eigenvectors corresponding to distinct eigenvalues are linearly independent
- **similarity** of matrices, written $A\sim B$

**Miniquiz 18 today**

*:* **Read:** §4.4
*Content:*
- ${}\sim{}$ is an equivalence relation
- properties shared by matrices $A$ and $B$ if they are similar:
  - determinant
  - invertibility
  - rank
  - characteristic polynomial
  - eigenvalues
- **diagonalizable** matrices
- $A\in M_{n\times n}(\RR)$ is diagonalizable iff $\RR^n$ has a basis consisting of eigenvectors of $A$
- Hand in **I³9** and **HW9**: 4.1.{24, 26, 37}, 4.3.{20, 24}
- **Miniquiz 19 today**

*:* **[Re]Read:** §4.4
*Content:*
- building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
  - a $2\times 2$ example
- **The Diagonalization Theorem**
- things to notice about an invertible matrix, say called $P$:
  - the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
  - conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is invertible $n\times n$.
  - multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j,$ again for $1\le j\le n$.
  - multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j,$ for $1\le j\le n$.
- what this has to do with *diagonalization*:
  - if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? Because $$\left(P^{-1}AP\right)\vec{e_j}=P^{-1}\left(A\left(P\vec{e_j}\right)\right)=P^{-1}\left(A\vec{p_j}\right)=P^{-1}\left(\lambda_j\vec{p_j}\right)=\lambda_j\left(P^{-1}\vec{p_j}\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]
- so we get the **Diagonalization Theorem**, two versions:
  - if separate bases of all the eigenspaces of an $n\times n$ matrix $A$, when put together, yield a basis of all of $\RR^n$, then $A$ is diagonalizable.
  - if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
- a corollary of the *Diagonalization Theorem* is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
- examples of the practical process of diagonalizing a matrix we are given, following the above procedure
- hints on the **Jordan Canonical Form**, examples of $2\times2$ matrices
**Miniquiz 20 today**
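The diagonalization procedure above is easy to carry out numerically: put eigenvectors into the columns of $P$ and check that $P^{-1}AP$ comes out diagonal. A sketch using NumPy (the matrix $A$ is a made-up example with two distinct eigenvalues, so it is diagonalizable by the corollary above):

```python
# Diagonalize A: columns of P are eigenvectors, so P^{-1} A P is diagonal.
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])       # example with 2 distinct eigenvalues (5 and 2)
lams, P = np.linalg.eig(A)                   # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(lams))         # P^{-1} A P is diagonal, eigenvalues on diagonal
assert np.allclose(A @ P[:, 0], lams[0] * P[:, 0])   # A p_j = lambda_j p_j
```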

*:* **Read:** §5.1
*Content:*
- review of the *Diagonalization Theorem* and some examples
- **orthogonal** and linearly independent sets of vectors in $\RR^n$
- an **orthonormal basis [ONB]**
**Maxiquiz 7 today**

*:* **[Re]Read:** §5.1
*Content:*
- going over *Maxiquiz 7*
- **coordinates with respect to an ONB** — computing the coefficients by dot products
- some convenient notation: the **Kronecker delta** $\delta_{ij}$
- definition of **orthogonal matrices**; notation $O(n)$ for the set of such
- finally, we define the notation $GL_n(\RR)$ for the set of invertible $n\times n$ matrices
- alternative characterization of orthogonal matrices
- **Miniquiz 21 today**
- Hand in **I³10** and **HW10**: 4.3.{34, 36}, 4.4.{24, 40, 47}

*:* **[Re]Read:** §5.1
*Content:*
- the effect of multiplying by an orthogonal matrix on the dot product or the norm of vectors
- $O(n)$ is closed under products and inverses
- determinants of orthogonal matrices
- $GL_n(\RR)$ and $O(n)$ are **groups**, meaning they are sets on which there is defined an associative multiplication (so the multiplication is closed: products of elements in the set stay in the set), there is a multiplicative identity, and every element has a multiplicative inverse.
- useful notation: "$\exists!x$..." means "**there exists a unique $x$ ...**"

**Miniquiz 22 today**
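The claim above that orthogonal matrices preserve dot products and norms can be verified on an example. Here $Q$ is a rotation of the plane (the angle 0.7 and the test vectors are arbitrary choices):

```python
# An orthogonal matrix Q preserves dot products and norms:
# (Qu).(Qv) = u.v and ||Qu|| = ||u||.
import numpy as np

t = 0.7                                        # arbitrary rotation angle
Q = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
assert np.allclose(Q.T @ Q, np.eye(2))         # Q^T Q = I, so Q is in O(2)

u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
assert np.isclose((Q @ u) @ (Q @ v), u @ v)    # dot product preserved
assert np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u))
assert np.isclose(abs(np.linalg.det(Q)), 1.0)  # det of an orthogonal matrix is +-1
```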

*:* **Read:** §5.2
*Content:*
- defining the **orthogonal complement** $W^\perp$ (pronounced "W-perp") of a subspace $W\subseteq\RR^n$
- proved that for any subspace $W$ of $\RR^n$, $W^\perp$ is also a subspace of $\RR^n$
- for a matrix $A$: perps of column- and row-spaces, nullspaces and nullspaces of $A^T$ (stated without proofs)
- orthogonal projections
- the **Orthogonal Decomposition Theorem**
**Miniquiz 23 today**

*:* **Read:** §5.3, *pp. 399-403 only*
*Content:*
- the **Gram-Schmidt Process**
- **Maxiquiz 8** handed out today; it is due Monday
- Hand in **I³11** and **HW11**: 5.1.{28, 37 *[hint: make an appropriate orthogonal matrix and use Theorem 5.6c]*}, 5.2.{24, 25}
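The Gram-Schmidt Process is itself a short algorithm: subtract from each vector its projections onto the orthonormal vectors found so far, then normalize. A sketch (the two input vectors are an arbitrary example):

```python
# A sketch of the Gram-Schmidt Process: turn a list of vectors into an
# orthonormal list with the same span.
import numpy as np

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = v - sum((v @ q) * q for q in basis)   # subtract projections onto earlier q's
        norm = np.linalg.norm(w)
        if norm > 1e-12:                          # skip (nearly) dependent vectors
            basis.append(w / norm)
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
Q = gram_schmidt(vs)
assert np.isclose(Q[0] @ Q[1], 0.0)               # orthogonal
assert all(np.isclose(np.linalg.norm(q), 1.0) for q in Q)   # unit vectors
```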

*:* **Read:** §5.4
*Content:*
- defining **orthogonally diagonalizable**
- starting the **Spectral Theorem**: orthogonally diagonalizable implies symmetric
- example of orthogonally diagonalizing a symmetric matrix: the key seems to be to find an ONB of $\RR^n$ consisting of eigenvectors
- mentioned (proof next time) that symmetric implies eigenspaces are orthogonal — this is the key step in the *Spectral Theorem*
- hand in **Maxiquiz 8**
- **Miniquiz 24 today**
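The ONB of eigenvectors that the Spectral Theorem promises for a symmetric matrix is exactly what NumPy's `eigh` routine computes; a quick check on a made-up symmetric example:

```python
# Spectral Theorem in action: for symmetric A, there is an orthogonal Q
# (columns = ONB of eigenvectors) with Q^T A Q diagonal.
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])        # symmetric example matrix
lams, Q = np.linalg.eigh(A)                   # eigh is for symmetric/Hermitian matrices
assert np.allclose(Q.T @ Q, np.eye(2))        # columns of Q are an ONB
assert np.allclose(Q.T @ A @ Q, np.diag(lams))   # orthogonally diagonalized
```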

*:* **[Re]Read:** §5.4
*Content:*
- even more on the **Spectral Theorem**
- consequences of *symmetry* ... distinct eigenspaces are orthogonal
- example application of the *Spectral Theorem:* the sum of two orthogonally diagonalizable matrices is also orthogonally diagonalizable
**Miniquiz 25 today**

*:* **Read:** §5.5 — *pp. 425-432 (the part called "Quadratic Forms") only*
*Content:*
- yet more on the *Spectral Theorem*
- defining **quadratic form**
- examples of quadratic forms: upward- and downward-pointing paraboloids and saddles
- diagonalization of quadratic forms
- brick-throwing demonstration: this is related to the *inertia tensor*, and applying the Spectral Theorem yields something called *The Principal Axes Theorem* in physics
- **Miniquiz 26 today**
- Hand in **I³12** and **HW12**: 5.3.{6, 8}, 5.4.{12, 14, 16}

*:* **Read:** §6.1
*Content:*
- defining **[abstract] vector space**
- starting examples (and non-examples) of *vector spaces:*
  - the trivial vector space $\{\vec{0}\}$
  - $\RR^n$ with the usual vector addition and scalar multiplication
  - $\RR^2$ with modified vector addition(s) is often **not** a vector space
  - spaces of functions, such as:
    - $\Ff(\RR)$ — the space of all functions on the real line $\RR$
    - $C(\RR)$ — the space of continuous functions on the real line $\RR$
    - $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
    - $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
    - $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
    - $\Pp$ — the space of all polynomials in one variable
  - $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices
**Maxiquiz 9 today**

*:* **[Re]Read:** §6.1
*Content:*
- more discussion of the vector spaces of functions we defined last class — these form a chain of subspaces:
  $\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
- some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition, *e.g.:*
  - in any vector space $V$, $0\vec{u}=\vec{0}\ \ \forall\vec{u}\in V$
  - in any vector space $V$, $(-1)\vec{u}=-\vec{u}\ \ \forall\vec{u}\in V$
  - in any vector space $V$, $\alpha\vec{0}=\vec{0}\ \ \forall \alpha\in\RR$
- defining **[vector] subspace**
- examples of *subspaces*
- how to check if something is a *subspace* (it's closed under vector addition and scalar multiplication)
**Miniquiz 27 today**

*:* **[Re]Read:** §6.1 and **Read:** §6.2
*Content:*
- more about vector subspaces, *e.g.,* the $\vec{0}$ in a subspace is the same vector as the $\vec{0}$ in the ambient space
- another example: the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
- defining **Span** in an abstract vector space
- *Span* as an intersection
- defining **linearly [in]dependent** in an abstract vector space, and examples
- Hand in **I³13** and **HW13**: 5.5.{38, 54}, 6.1.{2, 6, 48}
- **Miniquiz 28 today**

*:* **[Re]Read:** §6.2
*Content:*
- defining **basis** in an abstract vector space, and examples
- defining **dimension** in an abstract vector space, and examples
- defining **[in]finite dimensional** for an abstract vector space, and examples
- the **Basis Theorem** in an abstract vector space
- more on linear independence, particularly in infinite dimensional vector spaces
**Miniquiz 29 today**

*:* **[Re]Read:** §6.2 and **Read:** §6.3
*Content:*
- **coordinates with respect to a basis** in an abstract vector space
- the **change of basis matrix** $P_{\Cc\leftarrow\Bb}$

**Maxiquiz 10 today**

*:*
*Content:*
- going over *Maxiquiz 10*
- Review for Test II. See this review sheet.
- Hand in **I³14** and **HW14**: 6.2.{34, 44}, 6.3.{12, 16}

*:***Test II in class today.**

*:*- Test II post-mortem.
- no miniquiz today, alas...

*:* **Read:** §6.4
*Content:*
- a **linear transformation** between abstract vector spaces
- examples of *linear transformations:*
  - the **zero transformation**
  - the **identity transformation**
  - matrix multiplication
  - differentiation in $\Pp$
- properties of *linear transformations:*
  - they map $\vec{0}_V$ to $\vec{0}_W$
  - they behave nicely with respect to the "additive inverse" operation $\vec{v}\mapsto-\vec{v}$
  - compositions of LTs are LTs
- no maxiquiz today, alas again...
- Hand in Test II revisions, if you like.

*:* **[Re]Read:** §6.4 and **Read:** §6.5
*Content:*
- linear transformations are determined by what they do to a basis, which is, however, completely free: that is, if we want to make a linear transformation $T:V\to W$, and if $\{\vec{v}_1,\dots,\vec{v}_n\}$ is a basis of $V$, we can choose any vectors $\vec{w}_1,\dots,\vec{w}_n$ we like in $W$, and there will be a unique linear $T$ which satisfies $$T(\vec{v}_1)=\vec{w}_1,\quad\dots,\quad T(\vec{v}_n)=\vec{w}_n\ \ .$$
- more examples of *linear transformations*
- the **kernel** of a linear transformation
- the **range** of a linear transformation
- **one-to-one** (or **1-1** or **injective**)
- **onto** (or **surjective**)
- **inverses of linear transformations**

**Miniquiz 30 today**

*:* **[Re]Read:** §6.5
*Content:*
- kernels and ranges are always vector subspaces
- **rank** and **nullity** (again)
- **The Rank-Nullity Theorem** (again)
- **isomorphisms** and **isomorphic** (written "$\cong$")
- a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic

**Miniquiz 31 today**

*:* **Read:** §6.6
*Content:*
- defining the **matrix of a linear transformation with respect to bases of its domain and codomain**
- the matrix of a composition of linear transformations
- the matrix of the inverse of a linear transformation
- matrices of endomorphisms of vector spaces, and similarity using the change of basis matrix
**Miniquiz 32 today**

*:* **Read:** §7.1
*Content:*
- an **inner product** and **inner product space**
- examples of inner product spaces:
  - $\RR^n$ with the usual dot product
  - the $L^2$ inner product on $C([0,1])$
  - $L^2$ also works on $\Pp$ and $\Pp_n$ ($n\in\NN$)
  - another inner product on $\Pp_n$: $\langle p,q\rangle=p(0)q(0)+\dots+p(n)q(n)$ (just using the first term $p(0)q(0)$ works in all parts of the definition of an inner product except for non-degeneracy; with all $n+1$ terms it becomes a full inner product ... which we know because of the Fundamental Theorem of Algebra!)
- Hand in **I³15** and **HW15**: 6.4.{22, 24}, 6.5.{27, 34}, 6.6.16
- **Maxiquiz 11 today**

**Thanksgiving Break!**No classes, of course.

*:* **[Re]Read:** §7.1
*Content:*
- elementary properties of inner products
- the **length** or **norm** of vectors in an inner product space
- the **Pythagorean Theorem**
- **orthogonal** vectors in an inner product space
- **projections** and the **Gram-Schmidt Process** in an inner product space
- an orthonormal set in the inner product space $C([-\pi,\pi])$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios

**Miniquiz 33 today**

*:* **Read:** §7.3, *pp. 591-604 only*
*Content:*
- the **best approximation of a vector $\vec{v}$ in a subspace $W$**
- The **Best Approximation Theorem**
- the **least squares approximation**
- a **least squares solution of a linear system**
**Miniquiz 34 today**

*:* **[Re]Read:** §7.3, *pp. 591-604 only*
*Content:*
- the **normal equations** corresponding to a given linear system
- The **Least Squares Theorem**
- examples/applications of least squares solutions
**Miniquiz 35 today**
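The normal equations $A^TA\vec{x}=A^T\vec{b}$ give a concrete recipe for least squares solutions. A sketch fitting a line $y=c_0+c_1t$ to four made-up data points, comparing against NumPy's own least squares routine:

```python
# Least squares via the normal equations A^T A x = A^T b, fitting a line
# y = c0 + c1*t to made-up data points; compare with numpy's lstsq.
import numpy as np

t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.4, 2.9, 4.1])            # made-up measurements
A = np.column_stack([np.ones_like(t), t])     # columns: 1, t

x_normal = np.linalg.solve(A.T @ A, A.T @ y)  # solve the normal equations
x_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(x_normal, x_lstsq)         # both give the same best-fit line
```

(Solving the normal equations directly is fine for small well-conditioned problems; library routines like `lstsq` use more numerically stable factorizations.)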

*:*
*Content:*
- going over recent *HW*s
- review for the **Final Exam** next week; see this review sheet
- Hand in **I³16** and **HW16**: 7.1.{16, 34, 40}, 7.3.{8, 16, 26}
- **Maxiquiz 12 today**

**Exam week**, no classes.

*:*
- Today is the **last day** to hand in all late work and re-dos for class credit. You can also pick up the graded *HW16*. There will be extra office hours all day today, as well, but please make an appointment for a specific time if you know when it will be (if you don't, just drop by anyway).
*:***FINAL EXAM PART ONE from 8-10:20am in our usual classroom**

*:***FINAL EXAM PART TWO from 10:30am-12:50pm in our usual classroom**

Jonathan Poritz (jonathan@poritz.net)