Math 307 — Introduction to Linear Algebra

Course Schedule & Homework Assignments

Here is a link back to the course syllabus/policy page.

In the following, all sections and page numbers refer to the required course
textbook, *Linear Algebra: A Modern Introduction* (2nd edition), by David Poole.

Also in the following, an image icon means that class was videoed that day and the recording can be seen through the Blackboard page for this class. Note that if you know ahead of time that you will miss a class, you should tell me and I will be sure to video that day.

This schedule will change **very frequently**; please check it at
least every class day, and before starting work on any assignment (in case the
content of the assignment has changed).

*M:*
- *Content:*
- bureaucracy and introductions
- what is Linear Algebra (the study of vector spaces and linear transformations...)
- why do we do so much abstraction and formality at this point of the mathematics curriculum: what abstraction is good for

**Miniquiz 0**

*T:*
- **Read:** *To the Student, p. xxiii* and §§1.1–1.3
- **Miniquiz 1**
- *Journals* are **not due** today; it's a bit too early in the term. (The first *Journal* entry will be due next Tuesday.)
- *Content:*
- some basic terminology and notation:
- logical and basic set-theoretic terminology/notation
- some basic sets of numbers: **natural numbers** $\NN$, **integers** $\ZZ$, **rationals** $\QQ$, **real numbers** $\RR$

- starting good definitional style, including:
- **all** variables must be "bound"
- clearly identify the symbol and/or terminology being defined
- clearly identify the type of object being defined
- **vectors in $\RR^n$**, **vector addition**, **scalar multiplication**, the **dot product**, **norms**

- some basic properties
- of vector arithmetic
- of dot products and norms
- the triangle inequality

- Do **HW0:** *Send me e-mail* (to jonathan.poritz@gmail.com) *telling me*:
- Your name.
- Your e-mail address. (Please give me one that you actually check fairly frequently, since I may use it to contact you during the term.)
- Your year/program/major at CSUP.
- The reason you are taking this course.
- What you intend to do after CSUP, in so far as you have an idea.
- Past math classes you've had.
- Other math and science classes you are taking this term, and others you intend to take in coming terms.
- Your favorite mathematical subject.
- Your favorite mathematical result/theorem/technique/example/problem.
- Anything else you think I should know (disabilities, employment or other things that take a lot of time, *etc.*)
- [Optional:] If you were going to be trapped on a desert island alone for ten years, what music would you like to have?

*W:*
- **Read:** §§2.1 & 2.2
- **Miniquiz 2**
- *Content:*
- more basic terminology and notation...
- **complex numbers**, **angles between vectors**, **quantifiers** $\forall$ and $\exists$
- more on what makes a good definition
- **[systems of] linear equations**, **solutions of linear systems**, the **solution set** and its structure, **a[n in]consistent** linear system
- the **coefficient** and **augmented matrices** of a linear system
- **elementary row operations (EROs)**, **row-equivalent matrices**, **[in]homogeneous** linear systems

- more basic properties
- solution sets of linear systems are either empty, have exactly one point, or have an infinite number of points


*F:*
- **[Re]Read:** §2.2
- *Content:*
- yet more basic terminology and notation...
- **angles** between vectors, **orthogonal** vectors
- a **[non]trivial** solution of a linear system
- relation between [non]trivial solutions, [non]homogeneous linear systems, and [non]unique solutions
- more on what makes a good definition
- what makes a good statement of a result (theorem, proposition, *etc.*)
- [starting on] what makes a good proof
- proof structures:
- **unpacking** ("follow your nose")
- **contradiction** ("it can't *not* be true")
- many more to come...
- the **[reduced] row-echelon form of a matrix**, **row-reduction**, **free variables**
- the **rank** of a matrix

- more basic properties
- The **Rank-Nullity Theorem**
- **Maxiquiz 1 handed out today, due on Monday**
- **Today [Friday] is the last day to add classes.**
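(An optional illustration, not part of any assignment: the row-reduction procedure from today's class, sketched in Python/NumPy. The function name `rref` is my own; it carries out Gauss-Jordan elimination using the three EROs, and the number of pivot columns it finds is exactly the rank of the matrix.)

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduce A to reduced row-echelon form by Gauss-Jordan elimination."""
    R = A.astype(float).copy()
    rows, cols = R.shape
    pivot_row = 0
    pivot_cols = []
    for j in range(cols):
        if pivot_row >= rows:
            break
        # choose the largest entry in column j at or below pivot_row (partial pivoting)
        i = pivot_row + np.argmax(np.abs(R[pivot_row:, j]))
        if abs(R[i, j]) < tol:
            continue                  # no pivot here: column j gives a free variable
        R[[pivot_row, i]] = R[[i, pivot_row]]          # ERO 1: swap two rows
        R[pivot_row] = R[pivot_row] / R[pivot_row, j]  # ERO 2: scale the pivot row
        for k in range(rows):                          # ERO 3: add multiples of rows
            if k != pivot_row:
                R[k] -= R[k, j] * R[pivot_row]
        pivot_cols.append(j)
        pivot_row += 1
    return R, pivot_cols

A = np.array([[1., 2., 3.],
              [2., 4., 7.],
              [1., 2., 4.]])
R, pivots = rref(A)   # two pivot columns (0 and 2), so rank(A) = 2 and x_2 is free
```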

*M:*
- **[Re]Read:** §2.2 and **Read:** §2.3
- *Content:*
- terminology we needed to say out loud:
- **Gaussian elimination**, **Gauss-Jordan elimination**

- noticing that the definition we gave of the *angle* between two vectors needed $\frac{|\vec{v}\cdot\vec{w}|}{\|\vec{v}\|\,\|\vec{w}\|}\le 1$ in order to make sense; this is the content of the **Cauchy-Schwarz-Bunyakovsky Inequality**
- discussion: we say that some new term is **well-defined** if its definition makes sense, *e.g.:*
- any formulæ in the definition can be successfully computed (as, for example, the Cauchy-Schwarz inequality shows we can compute the $\arccos$ in the definition of the angle between two vectors)
- any value expressed in a definition must be clear, unambiguous, and *unique* (as, for example, the number of non-zero rows in the RRE version of a matrix is a uniquely defined number, which makes the rank of a matrix well-defined)
- **linear combination**, **span**

- Hand in **HW1:** 1.2.62, 2.2.44, 2.2.47
- Hand in **Maxiquiz 1**
- **Miniquiz 3**

*T:*
- **[Re]Read:** §2.3
- *Content:*
- going over *Miniquiz 3*, *Maxiquiz 1*, and *HW1*
- **context** and **type** for definitions ... see the handout on definitions for more on this theme
- some elementary facts about $\Span$, such as that each of the vectors $\vec{0},\vec{v}_1,\dots,\vec{v}_k$ is in the set $\Span(\vec{v}_1,\dots,\vec{v}_k)$
- **linearly [in]dependent** vectors — note it is important that the scalars in the definition are *not all zero*
- examples of linearly [in]dependent vectors
- starting the discussion of the relationship between [in]dependence and matrix multiplication by a matrix whose columns are the vectors under consideration

- **Miniquiz 4**

*W:*
- **[Re]Read:** §2.3
- *Content:*
- more examples of *linearly [in]dependent* vectors
- proving vectors are **linearly [in]dependent**
- the relationship of **linear [in]dependence** and the linear system whose coefficient matrix has columns which are the vectors under consideration:
- there will be a solution to the system if and only if the RHS vector is in the span of the columns
- there will be a non-trivial solution of the homogeneous linear system with that coefficient matrix if and only if the vectors are linearly dependent

- **Miniquiz 5**
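(An optional Python/NumPy illustration of my own, not part of the course materials: vectors $\vec{v}_1,\dots,\vec{v}_k$ are linearly independent exactly when the homogeneous system $A\vec{x}=\vec{0}$, with the $\vec{v}_i$ as the columns of $A$, has only the trivial solution, i.e. when $\operatorname{rank}(A)=k$.)

```python
import numpy as np

def are_independent(vectors):
    """v_1,...,v_k are linearly independent exactly when A x = 0 (A having the
    v_i as columns) has only the trivial solution, i.e. when rank(A) = k."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

v1, v2 = np.array([1., 0., 2.]), np.array([0., 1., 1.])
assert are_independent([v1, v2])                 # no nontrivial dependence
assert not are_independent([v1, v2, v1 + 3*v2])  # third vector is a combination of the first two
```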

*F:*
- **Read:** §3.1
- *Content:*
- proofs by induction (*e.g.,* 3.1.37)
- Hand in **HW2:** 2.3.43 and Chapter 2 Review problem 18 (on *p. 133*)
- **Maxiquiz 2 today**

*M:*
- **[Re]Read:** §3.1 and **Read:** §3.2
- *Content:*
- going over *Maxiquiz 2* — the moral was: *write down the definitions, it's often enough!*
- recall from Math 207 and your reading of the book the basic terminology:
- **matrices**
- matrix addition
- scalar multiplication with matrices
- matrix multiplication
- the **identity matrix**
- matrix algebra
- in class we recalled the definition of **transpose**
- starting the proof (by induction!) that the transpose of the sum of $k$ matrices is the sum of the transposes of those matrices, $\forall k$
- **Miniquiz 6**
- **Today [Monday] is the last day to drop classes without a grade being recorded**

*T:*
- **[Re]Read:** §3.2 and **Read:** §3.3
- *Content:*
- finishing the induction proof that the transpose of the sum of $k$ matrices is the sum of the transposes of those matrices, for any $k$
- how to submit electronic *Journal* entries and *HW*s
- some work with matrix multiplication:
- a very condensed version of the definition
- remember: this operation is *very rarely commutative*
- stated without proof that the transpose of the product of $k$ square matrices is the product *in the opposite order* of the transposes of those matrices; noted that a proof of this would also use induction
- **[skew-]symmetric** matrices, properties:
- symmetric and skew-symmetric matrices must be square
- a skew-symmetric matrix always has zeros on the diagonal
- Hand in **HW3:** 3.1.37
- Hand in (or submit electronically) your **Journal 1 on §§1.1–1.3, 2.1, & 2.2** and **Journal 2 on §§2.3, 3.1, & 3.2**. See this link (or Blackboard) for more info on what is expected of you and how to do it.
- **Miniquiz 7**

*W:*
- **[Re]Read:** §3.3
- *Content:*
- going over *HW2*:
- *style* is important in proofs! ... *Be stylish!*
- it helps to have a guess as to what you want to prove, then simply to write down all the definitions and see how they relate to each other (the **unpacking strategy**, after you decide what you want to unpack)
- a very **logic**al day:
- the details of an **if–then** statement, *i.e.,* one in the form $P\Rightarrow Q$
- the **converse** of $P\Rightarrow Q$ (which is $Q\Rightarrow P$)
- the **contrapositive** of $P\Rightarrow Q$ (which is $\neg Q\Rightarrow\neg P$)
- the **negation** of $P\Rightarrow Q$ (which is $P\land\neg Q$)
- if $P\Rightarrow Q$ is true, the converse may or may not be true
- an "if–then" statement is true if and only if its contrapositive is true
- the negation of a statement of the form $\forall x\ P(x)$ is $\exists x\ \neg P(x)$
- the negation of a statement of the form $\exists x\ P(x)$ is $\forall x\ \neg P(x)$
- *therefore,* discussing what is the negation of the statement $\forall a,b,c\in\RR\ (a\vec{u}+b\vec{v}+c\vec{w}=\vec{0}\Rightarrow a=b=c=0)$
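(These logical facts can be checked mechanically over all truth values. Here is a small Python sketch of my own, not an assignment, verifying that a statement always agrees with its contrapositive, that its negation is $P\land\neg Q$, and that the converse is genuinely different.)

```python
from itertools import product

def implies(p, q):
    """P => Q is false only in the single case: P true, Q false."""
    return (not p) or q

for P, Q in product([True, False], repeat=2):
    stmt = implies(P, Q)
    assert implies(not Q, not P) == stmt   # contrapositive is equivalent to the original
    assert (P and not Q) == (not stmt)     # negation of P => Q is "P and not-Q"

# the converse is NOT equivalent: with P false and Q true,
# P => Q holds but its converse Q => P fails
assert implies(False, True) and not implies(True, False)
```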

*F:*
- Hand in **HW4:** 3.2.33 and 3.3.46
- *Content:*
- going over some recent HW and miniquizzes
- discussion of how thinking of and writing down proofs is a serious new skill we are working on in this class, so we should definitely not expect it to be terrifically easy or to come quickly
- definition of the **inverse** of a matrix, and what it means for a matrix to be **invertible**
- proved the **Theorem:** Given $A$ and $B$ invertible matrices, $A\,B$ will be invertible, and $(A\,B)^{-1}=B^{-1}\,A^{-1}$
- **Maxiquiz 3 today**
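(The theorem on inverses of products can be illustrated numerically; this is a NumPy sketch of my own, not part of the homework. Random matrices are invertible with probability 1, so the example below is generic.)

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# the inverse of a product reverses the order: (AB)^{-1} = B^{-1} A^{-1}
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
assert np.allclose(lhs, rhs)

# and an inverse really inverts: A A^{-1} = I
assert np.allclose(A @ np.linalg.inv(A), np.eye(3))
```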

*M:*
- **[Re]Read:** §3.3
- *Content:*
- going over *Maxiquiz 3* — the morals were:
- "repeat until terms are gone..." really means *induction*, which would be better to write out explicitly
- when you do induction, you must:
- clearly write down the statement $S(n)$ over which you are inducting -- the hint will be if the thing to be proven has the structure $\forall n\in\NN\ S(n)$
- make sure you do a reasonable base case
- clearly enunciate the logic of each step, with lots of *transition words/phrases*
- *be careful* with induction; if you are not, you can make mistakes — *e.g.,* in class we discussed an inductive proof that all pigs are yellow [which is unfortunately incorrect ... extra credit points to whomever can find the flaw in the proof!] — so your best bet is to stick carefully to the standard structure of induction proofs and to check over each step very carefully
- discussion of where incautious induction can lead to false proofs (*e.g.,* the proof that all pigs are yellow)
- **elementary matrices**
- the **Fundamental Theorem of Invertible Matrices (version 1.0)**
- discussion about how it is a very good idea to *read the proofs in the book*; they will be a source of inspiration for your own proofs in the future

*T:*
- **Read:** §3.5
- *Content:*
- discussed why the "all pigs are yellow" proof fails -- the inductive step doesn't work unless $n > 2$, so either that step needs a better proof, or one needs a different base case
- the idea of a **subspace of $\RR^n$**
- examples of subspaces of $\RR^2$ (in fact, these are *all* possible subspaces):
- $\{\vec{0}\}$, the **trivial subspace**
- any line through the origin
- all of $\RR^2$
- notice all subspaces of $\RR^n$ have either exactly one vector in them (in which case we're talking about the trivial subspace) or an infinite number of vectors
- $\forall n,k\in\NN$ and $\forall\vec{v}_1,\dots,\vec{v}_k\in\RR^n$, $\Span(\vec{v}_1,\dots,\vec{v}_k)$ is a subspace of $\RR^n$
- subspaces associated with a given matrix:
- the **row space**
- the **column space**
- **Journal 3 on §§3.5 & 3.6** is **NOT** due this week... but don't forget to start thinking about it; there is a lot of material in these two sections. It will be due next Tuesday.
- **Miniquiz 8**

*W:*
- **[Re]Read:** §3.5
- *Content:*
- note that the definition of a subspace of $\RR^n$ would be equivalent to the one we use even without the specific requirement that $\vec{0}$ is in the subset, as long as the other parts are retained
- **basis**, examples of bases
- **dimension**, computations of dimensions for simple cases
- **Miniquiz 9**

*F:*
- **[Re]Read:** §3.5
- *Content:*
- **rank**
- the **Fundamental Theorem of Invertible Matrices 2.0**
- Hand in **HW5:** 3.5.4, 3.5.34
- **Maxiquiz 4 today**

*M:*
- **Read:** §3.6
- *Content:*
- going over *Maxiquiz 4*
- the **null space** of a matrix
- **nullity**
- **The Rank[-Nullity] Theorem**
- Hand in **HW6:** 3.5.56, 3.5.60 (these are serious problems — don't hesitate to contact me and ask for a hint)
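(An optional numerical check of the Rank-Nullity Theorem, sketched in Python/NumPy; not part of the course materials. It uses the singular value decomposition, which is beyond this course but is the standard numerical way to get a basis of the null space: the rows of $V^T$ past the rank span $\operatorname{null}(A)$.)

```python
import numpy as np

# a 3x4 matrix whose third row is the sum of the first two, so its rank is 2
A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])

rank = np.linalg.matrix_rank(A)         # dimension of the column (or row) space
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]                  # these rows span the null space of A
nullity = null_basis.shape[0]           # dimension of the null space

assert rank + nullity == A.shape[1]     # Rank-Nullity: rank + nullity = number of columns
assert np.allclose(A @ null_basis.T, 0) # the basis vectors really solve A x = 0
```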

*T:*
- **[Re]Read:** §3.6
- *Content:*
- **The Basis Theorem**
- **coordinates w.r.t. a basis** (what amounts to **Theorem 3.29** in §3.5)
- a **linear transformation**
- examples of linear transformations:
- the **trivial transformation**, which sends every input to $\vec{0}$
- some random formulæ
- **rotations** of $\RR^2$
- **reflections** of $\RR^2$
- a **Proposition:** If $f:\RR^n\to\RR^m$ is a linear transformation, then $f(\vec{0})=\vec{0}$
- **Journal 3 on §3.5 is due today**
- **Miniquiz 10**

*W:*
- **[Re]Read:** §3.6
- *Content:*
- more examples of linear transformations
- a linear transformation coming from matrix multiplication — what the book (and almost no one else) calls a "matrix transformation"
- a linear transformation is determined by what it does to a basis
- the **matrix $[T]$ of a linear transformation $T$**, with examples:
- a counterclockwise rotation of $\RR^2$ by the angle $\theta$ has matrix $\begin{pmatrix}\cos\theta&-\sin\theta\\ \sin\theta&\cos\theta\end{pmatrix}$
- a reflection of $\RR^2$ across the $y$-axis has matrix $\begin{pmatrix}-1&0\\ 0&1\end{pmatrix}$
- projection onto the $x$-axis in $\RR^2$ has matrix $\begin{pmatrix}1&0\\ 0&0\end{pmatrix}$
- composition of linear transformations:
- it's linear, too
- its matrix is the product of the matrices of the constituent transformations
- **Miniquiz 11**
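(An optional Python/NumPy sketch of today's examples, my own illustration rather than course material: the matrices of a rotation, a reflection, and a projection, and a check that composing transformations multiplies their matrices, since rotating by $\theta$ and then by $\varphi$ is rotating by $\theta+\varphi$.)

```python
import numpy as np

def rotation(theta):
    """Matrix of the counterclockwise rotation of R^2 by the angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

reflect_y = np.array([[-1., 0.],   # reflection across the y-axis
                      [ 0., 1.]])
project_x = np.array([[1., 0.],    # projection onto the x-axis
                      [0., 0.]])

# composition of linear transformations = product of their matrices
theta, phi = 0.3, 0.5
assert np.allclose(rotation(phi) @ rotation(theta), rotation(theta + phi))

# a transformation acts on a vector by matrix multiplication:
# a quarter turn sends e1 to e2
e1 = np.array([1., 0.])
assert np.allclose(rotation(np.pi / 2) @ e1, [0., 1.])
```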

*F:*
- **Read:** §4.2
- *Content:*
- recall about determinants:
- the determinant **determines** if a matrix is invertible: $\det(A)\ne0\ \Leftrightarrow\ A$ is invertible
- $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$ — which is *amazing*, because matrix multiplication is highly non-commutative, while multiplication of real numbers is commutative, yet $\det(\cdot)$ turns one into the other
- **Maxiquiz 5 today**
- Hand in **HW7:** 3.6.4, 3.6.8, 3.6.44

*M:*
- **[Re]Read:** §4.2 and **Read:** §4.1
- *Content:*
- going over *Maxiquiz 5*
- the **identity transformation** of $\RR^n$
- the **inverse** of a linear/matrix transformation
- definition of the **determinant**:
- for $1\times 1$ matrices
- for $2\times 2$ matrices
- recursively for $n\times n$ matrices, where $n\ge 3$ — which is essentially **Laplace's Expansion Theorem**
- [there is actually a direct (=non-recursive) formula]
- properties of determinants:
- $\det(A)\ne0\ \Leftrightarrow\ A$ is invertible
- $\forall A,\ \det(A)=\det(A^T)$
- $\forall A,B,\ \det(AB)=\det(A)\cdot\det(B)$
- determinants for triangular matrices
- **Miniquiz 12**
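(An optional NumPy check of the determinant properties listed above; my own illustration, not an assignment.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# det turns the (non-commutative!) matrix product into a product of real numbers
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# det(A) = det(A^T)
assert np.isclose(np.linalg.det(A), np.linalg.det(A.T))

# a matrix with two equal rows is not invertible, and indeed its determinant is 0
S = np.array([[1., 2.],
              [1., 2.]])
assert np.isclose(np.linalg.det(S), 0.0)
```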

*T:*
- **[Re]Read:** §4.1 and **Read:** §4.3
- *Content:*
- **eigenvectors**, **eigenvalues**, and **eigenspaces**
- an eigenspace is always a subspace of $\RR^n$; in fact, it is the nullspace of $A-\lambda I$
- the **characteristic polynomial/equation** of an $n\times n$ matrix
- the **algebraic multiplicity** of an eigenvalue
- **Journal 4 on §§3.6 & 4.2 is due today**
- Hand in **HW8:** 4.2.53, 4.2.54, 4.2.69
- **Miniquiz 13**

*W:*
- **[Re]Read:** §4.3 and **Read:** §4.4
- *Content:*
- the **geometric multiplicity** of an eigenvalue
- definition of matrix **similarity**
- examples of eigenspaces and multiplicities of eigenvalues
- **Miniquiz 14**

*F:*
- **[Re]Read:** §4.4
- *Content:*
- eigenvalues of triangular matrices
- eigenvalues of invertible matrices
- linear independence of eigenvectors corresponding to distinct eigenvalues
- properties of matrix similarity:
- **reflexive**
- **symmetric**
- **transitive**
- ...so it's an **equivalence relation**
- properties in common to similar matrices
- **diagonalizable** matrices
- building a basis for the ambient $\RR^n$ out of bases of all eigenspaces of an $n\times n$ matrix
- a $2\times 2$ example
- **The Diagonalization Theorem**
- **Maxiquiz 6 today**
- Hand in **HW9:** 4.1.35, 4.1.37, 4.3.20-22

*M:*
- *Content:*
- going over *Maxiquiz 6* and *HW9*
- review for *Midterm I*; see this review sheet
- discuss any issues with *HW10* — you are responsible for these problems for tomorrow's midterm!
- **Journal 5 on §§4.1 & 4.3 is due today** — show it to me or submit it electronically, but you can keep it to study from, if you like, and hand it in for real tomorrow
- **Miniquiz 15**

*T:*
- Hand in **Journal 5**, if you are doing it on paper
- Hand in **HW10:** 4.4.40-42, 4.4.47-48
- *Midterm I* in class today

*W:*
- **Read:** §5.1
- *Content:*
- going over *Midterm I*

*F:*
- **[Re]Read:** §5.1
- *Content:*
- things to notice about an invertible matrix, say called $P$:
- the columns of $P$ are a basis of $\RR^n$, call them $\vec{p}_1,\dots,\vec{p}_n$
- conversely, if $\{\vec{p}_1,\dots,\vec{p}_n\}$ is a basis of $\RR^n$, and $P$ is a matrix whose columns are these vectors $\vec{p}_j$, for $1\le j\le n$, then $P$ is invertible $n\times n$.
- multiplication on the left by $P$ transforms the standard basis of $\RR^n$ to the new basis consisting of the columns of $P$; that is, if $\vec{e}_1,\dots,\vec{e}_n$ is the standard basis (so $\vec{e}_j$ is really just the $j$th column of the $n\times n$ identity matrix, for $1\le j\le n$), then $\vec{p}_j=P\vec{e}_j,$ again for $1\le j\le n$.
- multiplication on the left by $P^{-1}$ transforms the basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ into the standard basis: $\vec{e}_j=P^{-1}\vec{p}_j,$ for $1\le j\le n$.

- what this has to do with *diagonalization*:
- if we can put together an entire basis $\{\vec{p}_1,\dots,\vec{p}_n\}$ of $\RR^n$ out of eigenvectors of some matrix $A$, so $A\vec{p}_j=\lambda_j\vec{p}_j$ for $1\le j\le n$, then building the matrix $P$ with columns from this basis, it will turn out that $P^{-1}AP$ is diagonal, with $\lambda_1,\dots,\lambda_n$ down the diagonal. [Why? Because $$\left(P^{-1}AP\right)\vec{e}_j=P^{-1}\left(A\left(P\vec{e}_j\right)\right)=P^{-1}\left(A\vec{p}_j\right)=P^{-1}\left(\lambda_j\vec{p}_j\right)=\lambda_j\left(P^{-1}\vec{p}_j\right)=\lambda_j\vec{e}_j$$ so the standard basis of $\RR^n$ consists of eigenvectors of $P^{-1}AP$ with eigenvalues $\lambda_1,\dots,\lambda_n$ ... which means $P^{-1}AP$ is diagonal as claimed.]

- so we get the **Diagonalization Theorem**, two versions:
- if separate bases of all the eigenspaces of an $n\times n$ matrix $A$, when put together, yield a basis of all of $\RR^n$, then $A$ is diagonalizable
- if the geometric multiplicity equals the algebraic multiplicity for every eigenvalue of a matrix $A$, then $A$ is diagonalizable
- a corollary of the *Diagonalization Theorem* is that if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then it is diagonalizable
- Hand in revised solutions to *Midterm I*, if you like
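(The computation $P^{-1}AP=D$ can be watched numerically; this is an optional NumPy sketch of my own, not part of the course. `np.linalg.eig` returns the eigenvalues together with a matrix $P$ whose columns are eigenvectors.)

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])            # eigenvalues 1 and 3, eigenvectors (1,-1) and (1,1)

lams, P = np.linalg.eig(A)           # columns of P are eigenvectors of A
D = np.linalg.inv(P) @ A @ P         # conjugating by P changes to the eigenvector basis

assert np.allclose(D, np.diag(lams)) # P^{-1} A P is diagonal, eigenvalues on the diagonal
assert np.allclose(sorted(lams), [1., 3.])
```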

*M:*
- **[Re]Read:** §5.1
- *Content:*
- **orthogonal** and linearly independent sets of vectors in $\RR^n$
- an **orthonormal basis [ONB]**
- **coordinates with respect to an ONB** — computing the coefficients by dot products
- defining **orthogonal matrix** and the set $O(n)$ of $n\times n$ orthogonal matrices
- **Miniquiz 16** (on §5.1!)

*T:*
- **[Re]Read:** §5.1
- *Content:*
- **orthogonal matrices**:
- definition
- alternative characterizations
- their effect on the dot product or the norm
- the set of such is closed under products and inverses *[hence we call the set $O(n)$ of orthogonal matrices the* **orthogonal group***]*
- determinants of orthogonal matrices
- **Miniquiz 17**

*W:*
- **Read:** §5.2
- Hand in **HW11:** 5.1.26, 5.1.28(a)&(b) [read but do not do 5.1.28(c)&(d)]
- *Content:*
- defining the **orthogonal complement** $W^\perp$ (pronounced "W-perp") of a subspace $W\subseteq\RR^n$
- for a matrix $A$: perps of column- and row-spaces, nullspaces, and nullspaces of $A^T$
- **Miniquiz 18**

*F:*
- **[Re]Read:** §5.2 and **Read:** §5.3, *pp. 385-389 only*
- *Content:*
- orthogonal projections
- the **Orthogonal Decomposition Theorem**
- the **Gram-Schmidt Process**
- **Maxiquiz 7 handed out today**
- Hand in **HW12:** 5.2.27, 5.2.29
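(An optional Python/NumPy sketch of the Gram-Schmidt Process, my own illustration rather than course material: at each step, subtract off the orthogonal projections onto the vectors already built, then normalize.)

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # remove the components of v along the orthonormal vectors built so far
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

q1, q2 = gram_schmidt([np.array([1., 1., 0.]),
                       np.array([1., 0., 1.])])
assert np.isclose(np.dot(q1, q2), 0.0)          # orthogonal...
assert np.isclose(np.linalg.norm(q1), 1.0)      # ...and each of unit length
assert np.isclose(np.linalg.norm(q2), 1.0)
```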

*M:*
- **Read:** §5.4
- *Content:*
- some discussion of problems from **HW12**
- defining **orthogonally diagonalizable**
- starting the **Spectral Theorem**: orthogonally diagonalizable implies symmetric
- **Journal 6 on §§5.1, 5.2, & 5.3 (only *pp. 385-389* in 5.3) is due today**
- Hand in **Maxiquiz 7**
- Hand in **HW13:** 5.2.25, 5.3.3
- **Miniquiz 19**

*T:*
- **[Re]Read:** §5.4
- *Content:*
- consequences of *symmetry* ... having real eigenvalues...
- more on the *Spectral Theorem*
- **Miniquiz 20**

*W:*
- **[Re]Read:** §5.4
- *Content:*
- even more on the **Spectral Theorem** — examples
- Hand in **HW14:** 5.4.12, 5.4.14, 5.4.16
- **Miniquiz 21**

*F:*
- **Read:** §5.5, *pp. 411-426 only*
- *Content:*
- yet more on the *Spectral Theorem*
- brick-throwing demonstration
- **Maxiquiz 8 today**
- **Today [Friday] is the last day to withdraw (with a *W*) from classes**

*M:*
- **[Re]Read:** §5.5, *pp. 411-426 only*
- *Content:*
- defining **quadratic form**
- examples of quadratic forms: upward- and downward-pointing paraboloids and saddles
- diagonalization of quadratic forms
- our throwing-bricks demo from last week is related to the *inertia tensor*, and applying the Spectral Theorem yields something called *The Principal Axes Theorem* in physics

*T:*
- **Read:** §6.1
- *Content:*
- defining **vector space**
- starting examples (and non-examples) of *vector spaces*:
- the trivial vector space $\{\vec{0}\}$
- $\RR^n$ with the usual vector addition and scalar multiplication
- $\RR^2$ with modified vector addition(s) is often **not** a vector space
- Start **HW15**, which is due tomorrow
- **Miniquiz 22**

*W:*
- **[Re]Read:** §6.1
- *Content:*
- some algebraic (arithmetic?) properties in vector spaces which are consequences of their definition (*e.g.,* in any vector space $V$, $0\vec{u}=\vec{0}\ \forall \vec{u}\in V$)
- more examples of *vector spaces*:
- spaces of functions, such as:
- $\Ff(\RR)$ — the space of all functions on the real line $\RR$
- $C(\RR)$ — the space of continuous functions on the real line $\RR$
- $C^k(\RR)$ for $k\in\NN$ — the space of $k$ times continuously differentiable functions on the real line $\RR$
- $C^\infty(\RR)$ — the space of infinitely differentiable functions on the real line $\RR$
- $\Pp_k$ for $k\in\NN$ — the space of polynomials in one variable of degree at most $k$
- $\Pp$ — the space of all polynomials in one variable
- $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices
- Hand in **HW15:** 5.5.32, 5.5.38, 5.5.42
- **Miniquiz 23**

*F:*
- **[Re]Read:** §6.1
- *Content:*
- more discussion of the vector spaces of functions we defined last class -- these form a chain of subspaces:

  $\qquad\Pp_1\subseteq\Pp_2\subseteq\dots\subseteq\Pp\subseteq C^\infty(\RR)\subseteq\dots\subseteq C^2(\RR)\subseteq C^1(\RR)\subseteq C(\RR)\subseteq\Ff(\RR)$
- defining **[vector] subspace**
- examples of *subspaces*
- how to check if something is a *subspace*
- **Maxiquiz 9 today**

**Spring Break!** No classes, of course.
- But please catch up on any old homeworks or re-dos of old assignments, to hand in on Monday after the break.
- Also, do not forget that **HW16** is due the first day after break, and **Journal 7** the day after that.

*M:*
- **[Re]Read:** §6.1 and **Read:** §6.2
- *Content:*
- going over *Maxiquiz 9* and recent *HW*s
- still more about vector subspaces
- another vector space example: $M_{m\times n}$ — the space of $m\times n$ matrices, with the usual addition of matrices and scalar multiplication on matrices
- the subspaces of symmetric or skew-symmetric matrices in $M_{n\times n}$
- Hand in **HW16:** 6.1.{2, 6, 12, 48}
- **Miniquiz 24**

*T:*
- **[Re]Read:** §6.2
- *Content:*
- defining **Span** in an abstract vector space
- properties of *Span* in an abstract vector space
- defining **linearly [in]dependent** in an abstract vector space, and examples
- defining **basis** in an abstract vector space, and examples
- defining **dimension** in an abstract vector space, and examples
- **Journal 7 on §§5.4, 5.5 (only *pp. 411-418*), & 6.1 is due today**
- **Miniquiz 25**

*W:*
- **[Re]Read:** §6.2
- *Content:*
- a little more care with finite and infinite sets of vectors which might be linearly [in]dependent
- defining **[in]finite dimensional** for an abstract vector space, and examples
- the **Basis Theorem** in an abstract vector space
- **Miniquiz 26**

*F:*
- **[Re]Read:** §6.2
- *Content:*
- yet more care with finite and infinite sets of vectors which might be linearly [in]dependent
- be careful of the definition of **dimension** in the book; it doesn't quite work!
- **coordinates of a vector $\vec{v}$ with respect to a basis $\Bb$** in an abstract vector space: $[\vec{v}]_\Bb\in\RR^k$, if $\Bb$ consists of $k$ basis vectors
- Hand in **HW17:** 6.1.46, 6.2.{6, 34, 44}
- **Maxiquiz 10 today**

*M:*
- **Read:** §6.3 and **[Re]Read:** §6.4
- *Content:*
- going over *Maxiquiz 10* and recent *HW*s
- still more about coordinates:
- a **change-of-basis matrix $P_{\Cc\leftarrow\Bb}$** for converting from basis $\Bb$ to basis $\Cc$, by $[\vec{v}]_\Cc=P_{\Cc\leftarrow\Bb}\,[\vec{v}]_\Bb$
- this $P_{\Cc\leftarrow\Bb}$ will be $k\times k$ if $\Bb$ (and thus also $\Cc$) consists of $k$ vectors
- $P_{\Cc\leftarrow\Bb}$ is invertible: in fact, $P_{\Cc\leftarrow\Bb}^{-1}=P_{\Bb\leftarrow\Cc}$
- **Miniquiz 27**
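(An optional NumPy sketch of change of basis in $\RR^2$, my own illustration rather than course material. Writing the two bases as the columns of matrices $B$ and $C$, the same vector satisfies $B[\vec{v}]_\Bb = C[\vec{v}]_\Cc$, so $P_{\Cc\leftarrow\Bb}=C^{-1}B$.)

```python
import numpy as np

# two bases of R^2, written as the columns of matrices B and C
B = np.array([[1., 1.],
              [0., 1.]])
C = np.array([[2., 0.],
              [0., 1.]])

# if v = B [v]_B = C [v]_C, then [v]_C = C^{-1} B [v]_B,
# so the change-of-basis matrix P_{C<-B} is C^{-1} B
P_C_from_B = np.linalg.inv(C) @ B

vB = np.array([3., 4.])            # coordinates of some vector w.r.t. basis B
vC = P_C_from_B @ vB               # its coordinates w.r.t. basis C
assert np.allclose(B @ vB, C @ vC) # both coordinate vectors name the same vector

# P_{C<-B} is invertible, and its inverse is P_{B<-C} = B^{-1} C
assert np.allclose(np.linalg.inv(P_C_from_B), np.linalg.inv(B) @ C)
```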

*T:*
- **Read:** §6.4
- *Content:*
- more examples of *change-of-basis* matrices
- definition of a **linear transformation** between abstract vector spaces
- **Miniquiz 28**

*W:*
- **[Re]Read:** §6.4
- *Content:*
- examples of *linear transformations*:
- the **zero transformation**
- the **identity transformation**
- matrix multiplication
- differentiation in $\Pp$
- review for *Midterm II*; see this review sheet
- Hand in **HW18:** 6.3.16, 6.3.21 *[Hint: use **Thm 6.12b**.]*, 6.4.{20, 22, 24}
- **Miniquiz 29** (handed out as a take-home quiz, due Friday)

*F:*
- *Midterm II* in class today
- **Journal 8 on §§6.2-6.4 is due today**

*M:*
- *Content:*
- going over *Midterm II*

*T:*
- **[Re]Read:** §6.4
- *Content:*
- linear transformations are determined by what they do to a basis, which is, however, completely free: that is, if we want to make a linear transformation $T:V\to W$, and if $\{\vec{v}_1,\dots,\vec{v}_n\}$ is a basis of $V$, we can choose any vectors $\vec{w}_1,\dots,\vec{w}_n$ we like in $W$, and there will be a unique linear $T$ which satisfies $$T(\vec{v}_1)=\vec{w}_1,\quad\dots,\quad T(\vec{v}_n)=\vec{w}_n\ \ .$$
- **composition of linear transformations**
- **inverses of linear transformations**
- Hand in revised solutions to *Midterm II*, if you like

*W:*
- **Read:** §6.5
- *Content:*
- last words about inverses of linear transformations
- the **kernel** of a linear transformation
- the **range** of a linear transformation
- **one-to-one** (or **1-1** or **injective**)
- **onto** (or **surjective**)
- **Miniquiz 30**

*F:*
- **Read:** §6.6
- *Content:*
- kernels and ranges are always vector subspaces
- **rank** and **nullity** (again)
- **The Rank-Nullity Theorem** (again)
- **isomorphisms** and **isomorphic**
- Hand in **HW19:** 6.4.{26, 32}, 6.5.{4, 24, 27, 34}
- **Maxiquiz 11 today**

*M:*
- **Read:** §7.1
- *Content:*
- going over *Maxiquiz 11*
- a theorem giving a necessary and sufficient condition for finite dimensional vector spaces to be isomorphic
- an **inner product** and **inner product space**
- examples of inner product spaces:
- $\RR^n$ with the usual dot product
- the $L^2$ inner product on $C[0,1]$
- **Journal 9 on §§6.5 & 6.6 is due today**
- **Miniquiz 31**

*T:*
- **[Re]Read:** §7.1
- *Content:*
- elementary properties of inner products
- **length** or **norm** of vectors in an inner product space
- the **Pythagorean Theorem**
- **orthogonal** vectors in an inner product space
- **projections** and the **Gram-Schmidt Process** in an inner product space
- an orthonormal set in the inner product space $C[-\pi,\pi]$ with the $L^2$ inner product: trigonometric functions, and the connection with Fourier Analysis and radios
- **Miniquiz 32**
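(An optional numerical illustration of my own, not course material: approximating the $L^2$ inner product $\langle f,g\rangle=\int_{-\pi}^{\pi}f(x)g(x)\,dx$ on $C[-\pi,\pi]$ by a Riemann sum shows that $\sin$ and $\cos$ are orthogonal, each with $\|f\|^2=\pi$.)

```python
import numpy as np

# approximate <f, g> = integral over [-pi, pi] of f(x) g(x) dx by a Riemann sum
x = np.linspace(-np.pi, np.pi, 100001)
dx = x[1] - x[0]

def ip(f, g):
    return np.sum(f(x) * g(x)) * dx

assert abs(ip(np.sin, np.cos)) < 1e-6                    # sin and cos are orthogonal
assert np.isclose(ip(np.sin, np.sin), np.pi, atol=1e-3)  # ||sin||^2 = pi
assert np.isclose(ip(np.cos, np.cos), np.pi, atol=1e-3)  # ||cos||^2 = pi
```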

*W:*
- **Read:** §7.2, *pp. 561-564 only*
- *Content:*
- the **distance** between vectors in an inner product space
- a **norm** and a **normed linear space**
- the **distance function $d(\vec{u},\vec{v})$** in a normed linear space
- a **metric** and **metric space**
- examples of norms/metrics:
- the norm coming from an inner product
- the **sum norm** (also called the **$L^1$ norm**)
- the **max norm** (also called the **sup** or **$L^\infty$ norm**)
- the **taxicab metric**
- Hand in **HW20:** 7.1.{36, 40}, 7.2.{8, 14}
- **Journal 10 on §§7.1 & 7.2** *[only the parts we covered in this class]* is due today
- **Miniquiz 33**
- **Hand in all late work and re-dos by 4pm today** if you want them corrected and returned to you on Friday
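(An optional NumPy comparison of the three norms above on the same pair of vectors, my own illustration rather than course material: each norm induces a distance $d(\vec{u},\vec{v})=\|\vec{u}-\vec{v}\|$.)

```python
import numpy as np

u = np.array([3., -4.])
v = np.array([0., 1.])
w = u - v                                      # w = (3, -5)

two_norm = np.linalg.norm(w)                   # the usual norm, from the dot product
sum_norm = np.linalg.norm(w, ord=1)            # the sum (L^1) norm: |w_1| + |w_2|
max_norm = np.linalg.norm(w, ord=np.inf)       # the max (sup, L^infty) norm

assert np.isclose(sum_norm, 8.0)               # 3 + 5: the "taxicab" distance from v to u
assert np.isclose(max_norm, 5.0)               # max(3, 5)
assert np.isclose(two_norm, np.sqrt(34.0))     # sqrt(9 + 25)
```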

*F:*
- *Content:*
- going over recent *HW*s
- **review for the *Final Exam* next week**; see this review sheet
- **Last day to hand in [by noon!] all late work and re-dos for class credit** *[although materials handed in today will not be returned]*

**Exam week**, no classes.
- Our *[comprehensive]* **FINAL EXAM** is scheduled for **both** of the time slots **Monday, April 30th, 8:00-10:20** and **Tuesday, May 1st, 8:00-10:20**, in our usual classroom. The format of the test is described on the final exam review sheet, but note that the parts in which you will have to state definitions and theorems will be on Monday (along with some other questions) — on Tuesday's part of the test, you will be able to refer to these definitions in your new work. So, in short: Tuesday is entirely problem-oriented, with little need for memorization, while for Monday, you will be required to have detailed, precise definitions and statements in your head.

Jonathan Poritz (jonathan.poritz@gmail.com) | Page last modified: Monday, 21-Jul-2014 04:56:31 UTC |