**330 LINEAR ALGEBRA I (3+0) 3 credits**

Vector analysis continued; abstract vector spaces; bases, inner products; projections; orthogonal complements, least squares; linear maps, structure theorems; elementary spectral theory; applications. Corequisite(s): MATH 283 R.

| Instructor | Course | Section | Time |
|------------|--------|---------|------|
| Eric Olson | Math 330 Linear Algebra | 1006 | 4:30-5:45pm TR, AB 634 |

- Instructor: Eric Olson
- Email: ejolson at unr edu
- Office: Tuesday and Thursday 2pm DMS 238 and by appointment.
- Homepage: http://fractal.math.unr.edu/~ejolson/330/
- Texts: Linear Algebra and Its Applications, 4th Edition by David C. Lay, https://www.pearson.com/mylab

- State the Gram-Schmidt orthogonalization algorithm.
- Given A ∈ **R**^{m×n}, find the factorization A = QR where Q ∈ **R**^{m×n} is a matrix with orthonormal columns and R ∈ **R**^{n×n} is upper triangular.
- Given the reduced QR factorization of a matrix A, find the vector x which minimizes ||Ax−b||.
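The QR and least-squares objectives above can be sketched in numpy. This is an illustrative example only, not from the course materials: the function names, the matrix `A`, and the vector `b` are made up for the demonstration.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Reduced QR factorization A = QR via classical Gram-Schmidt.

    Q has orthonormal columns and R is upper triangular.  Assumes
    the columns of A are linearly independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component along earlier direction
            v -= R[i, j] * Q[:, i]        # subtract the projection
        R[j, j] = np.linalg.norm(v)       # length of what remains
        Q[:, j] = v / R[j, j]
    return Q, R

def least_squares(A, b):
    """Minimize ||Ax - b|| using A = QR: solve the triangular system R x = Q^T b."""
    Q, R = gram_schmidt_qr(A)
    return np.linalg.solve(R, Q.T @ b)

# Best-fit line through the points (1,1), (2,2), (3,2).
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
x = least_squares(A, b)
```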

- Definitions of the following terms:
- Symmetric matrix.
- Orthogonal matrix.
- Characteristic polynomial.
- Eigenvector and eigenvalue.
- Orthogonally diagonalizable.

- State the following theorems and algorithms:
- Section 7.1 Theorem 3 [The Spectral Theorem]: An n×n symmetric real-valued matrix A has the following properties.
- A has n real eigenvalues, counting multiplicities.
- The dimension of the eigenspace for each eigenvalue λ equals the multiplicity of λ as a root of the characteristic equation.
- The eigenspaces are mutually orthogonal, in the sense that eigenvectors corresponding to different eigenvalues are orthogonal.
- A is orthogonally diagonalizable, that is, there exists an orthonormal basis consisting of eigenvectors of A.
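As a numerical illustration of the Spectral Theorem (the matrix here is an arbitrary example, not from the course), `numpy.linalg.eigh` returns real eigenvalues and an orthonormal basis of eigenvectors for a symmetric matrix, giving A = P D Pᵀ with P orthogonal and D diagonal:

```python
import numpy as np

# A symmetric real matrix: its eigenvalues are real and its
# eigenvectors can be chosen orthonormal.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eigh(A)  # columns of P are orthonormal eigenvectors
D = np.diag(eigenvalues)            # A = P D P^T  (orthogonal diagonalization)
```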

- Section 5.8 [The Power Method for Estimating a Strictly Dominant Eigenvalue]:
- Select an initial vector x_{0} whose largest entry is 1.
- For k = 0, 1, ...,
- Compute A x_{k}.
- Let μ_{k} be an entry in A x_{k} whose absolute value is as large as possible.
- Compute x_{k+1} = (1/μ_{k}) A x_{k}.
- For almost all choices of x_{0}, the sequence { μ_{k} } approaches the dominant eigenvalue, and the sequence { x_{k} } approaches a corresponding eigenvector.
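The power-method steps above can be sketched as follows. The test matrix and starting vector are arbitrary examples chosen so the method converges quickly, not values from the course materials:

```python
import numpy as np

def power_method(A, x0, steps=50):
    """Power method as outlined above: rescale each iterate by the
    entry of A x_k of largest absolute value, so that mu_k approaches
    the strictly dominant eigenvalue and x_k a corresponding
    eigenvector (whose largest entry is 1)."""
    x = x0.astype(float)
    mu = 0.0
    for _ in range(steps):
        Ax = A @ x
        mu = Ax[np.argmax(np.abs(Ax))]  # entry with largest absolute value
        x = Ax / mu                     # x_{k+1} = (1/mu_k) A x_k
    return mu, x

A = np.array([[4.0, 1.0],
              [1.0, 4.0]])             # eigenvalues 5 and 3; 5 is dominant
mu, x = power_method(A, np.array([1.0, 0.0]))
```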

- Know how to prove the following theorems:
- Section 7.1 Theorem 1: If A is real-valued and symmetric, then
any two eigenvectors from different eigenspaces are orthogonal.
- Theorem on Real Eigenvalues: The eigenvalues of a real-valued n x n symmetric matrix are all real.

- Compute the characteristic polynomial of a matrix.
- Compute the eigenvalues of a matrix using
the characteristic equation.
- Find the eigenvectors of a matrix from the eigenvalues.
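The computations above can be sketched for a 2×2 example (the matrix is an arbitrary illustration). For a 2×2 matrix the characteristic polynomial is λ² − trace(A)λ + det(A), and its roots are the eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [3.0, 2.0]])

# Characteristic polynomial det(A - lambda I) for a 2x2 matrix:
# lambda^2 - trace(A) lambda + det(A), here lambda^2 - 4 lambda - 5.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]

# Eigenvalues are the roots of the characteristic equation.
eigenvalues = np.roots(coeffs)

# An eigenvector for lambda = 5 is found by solving (A - 5I)v = 0.
v = np.array([1.0, 1.0])
```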

- Definitions of the following terms:
- Determinant (using recursive definition).
- Stochastic matrix.

- Proof of the following theorems:
- Section 3.3 Theorem 7 [Cramer's Rule]: Let A be an invertible n×n matrix. For any b in **R**^{n}, the unique solution x of Ax=b has entries given by x_{i} = det(A_{i}(b))/det(A) where i = 1, 2, ..., n.
- Section 4.5 Theorem 9: If a vector space V has a basis B = {b_{1}, ..., b_{n}}, then any set in V containing more than n vectors must be linearly dependent.
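Cramer's Rule can be sketched directly from the formula x_{i} = det(A_{i}(b))/det(A). The function name and the example system below are illustrative, not from the course materials:

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_i = det(A_i(b)) / det(A),
    where A_i(b) is A with column i replaced by b.  Assumes A is
    invertible (det(A) != 0)."""
    n = A.shape[0]
    d = np.linalg.det(A)
    x = np.empty(n)
    for i in range(n):
        Ai = A.copy()
        Ai[:, i] = b          # replace column i by b
        x[i] = np.linalg.det(Ai) / d
    return x

# Example system: 2x + y = 3, x + 3y = 5.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
```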

- Given a matrix A in **R**^{m×n}, be able to use Gaussian elimination and pivoting if necessary to find the reduced row-echelon form.
- Be able to compute det(A) for A in **R**^{n×n} given the factorization A=PLDU where L is lower triangular, D is diagonal, U is upper triangular and P is a permutation matrix.
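Computing det(A) from A = PLDU uses two facts: the determinant of a triangular (or diagonal) matrix is the product of its diagonal entries, and det(P) = ±1 for a permutation matrix. A minimal sketch (the factors below are an arbitrary made-up example):

```python
import numpy as np

def det_from_pldu(P, L, D, U):
    """det(A) for A = PLDU:
    det(A) = det(P) * det(L) * det(D) * det(U), where each triangular
    or diagonal factor contributes the product of its diagonal and
    det(P) is the sign (+1 or -1) of the permutation."""
    sign = round(np.linalg.det(P))   # permutation sign
    return sign * np.prod(np.diag(L)) * np.prod(np.diag(D)) * np.prod(np.diag(U))

# Made-up example factors.
P = np.array([[0.0, 1.0], [1.0, 0.0]])   # swap rows: det(P) = -1
L = np.array([[1.0, 0.0], [2.0, 1.0]])
D = np.diag([3.0, 4.0])
U = np.array([[1.0, 5.0], [0.0, 1.0]])
A = P @ L @ D @ U                        # det(A) = (-1) * 1 * 12 * 1 = -12
```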

- Definitions of the following terms:
- A linear function.
- The augmented matrix.
- Row-echelon and reduced row-echelon form.
- A subspace.
- The span of a set of vectors span{v_{1}, v_{2}, …, v_{k}}.
- Linear independence of a set of vectors.
- A basis of a subspace.
- The Null space Nul(A) of a matrix.
- The Column space Col(A) of a matrix.

- How to perform the elementary row operations.
- Elimination Step: r_{i} ← r_{i} + αr_{j}.
- Scaling Step: r_{i} ← αr_{i}.
- Row Swap: r_{i} ↔ r_{j}.

- Be able to write the matrices which correspond to the
elementary row operations.
- Know how to write a system of linear equations as a
matrix equation of the form Ax=b and how to write the matrix
equation Ax=b as a system of linear equations.
- Be able to interchange the order of elimination and row-swap matrices. For example, [r_{2} ← r_{2} + 3r_{1}] [r_{1} ↔ r_{3}] = [r_{1} ↔ r_{3}] [r_{i} ← r_{i} + αr_{j}] for what values of i, j and α?
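Such identities can be checked numerically by building the elementary matrices and comparing products. The sketch below verifies one consistent answer to the example above, i = 2, j = 3, α = 3 (swapping rows 1 and 3 relabels the row that the elimination step reads from):

```python
import numpy as np

# Elementary row operation matrices act by left multiplication.
I = np.eye(3)

E = I.copy();  E[1, 0] = 3.0    # [r2 <- r2 + 3 r1]
P = I[[2, 1, 0], :]             # [r1 <-> r3]
E2 = I.copy(); E2[1, 2] = 3.0   # [r2 <- r2 + 3 r3], i.e. i=2, j=3, alpha=3

# Claim: E P = P E2, so the swap can be moved past the elimination
# step at the cost of relabeling the source row.
```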

- Given a matrix A ∈ **R**^{m×n}, use the elementary row operations to perform Gaussian elimination and find the row-echelon and reduced row-echelon forms of A.
- Given the sequence of row operations used to find the row-echelon form, find the factorizations A=LU, A=LDU and A=PLDU where L is lower triangular, D is diagonal, U is upper triangular and P is a permutation matrix.
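The connection between elimination and the LU factorization can be sketched as follows, for the case where no row swaps are needed (the function name and example matrix are illustrative, not from the course):

```python
import numpy as np

def lu_no_pivot(A):
    """A = LU by Gaussian elimination without row swaps: U is the
    row-echelon form, and L records the multiplier m used in each
    elimination step r_i <- r_i - m r_j.  Assumes no zero pivots
    arise (otherwise pivoting and a PLU factorization are needed)."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for j in range(n - 1):
        for i in range(j + 1, n):
            m = U[i, j] / U[j, j]   # multiplier for this elimination step
            L[i, j] = m             # store it in L
            U[i, :] -= m * U[j, :]  # r_i <- r_i - m r_j
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
```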

- Given a matrix A ∈ **R**^{m×n} and its reduced row-echelon form R:
- Find a basis for the column space Col(A).
- Find a basis for the null space Nul(A).
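Both bases can be read off from the reduced row-echelon form: the pivot columns of A (not of R) form a basis for Col(A), and setting each free variable to 1 in turn in Rx = 0 gives a basis for Nul(A). A sketch, with an arbitrary example matrix:

```python
import numpy as np

def rref(A, tol=1e-12):
    """Reduced row-echelon form of A and its pivot columns,
    computed by Gauss-Jordan elimination with partial pivoting."""
    R = A.astype(float).copy()
    m, n = R.shape
    pivots = []
    row = 0
    for col in range(n):
        if row >= m:
            break
        p = row + np.argmax(np.abs(R[row:, col]))
        if abs(R[p, col]) < tol:
            continue                     # no pivot in this column
        R[[row, p]] = R[[p, row]]        # row swap
        R[row] /= R[row, col]            # scale pivot to 1
        for i in range(m):
            if i != row:
                R[i] -= R[i, col] * R[row]   # clear the rest of the column
        pivots.append(col)
        row += 1
    return R, pivots

A = np.array([[1.0, 2.0, 0.0, 3.0],
              [2.0, 4.0, 1.0, 8.0],
              [3.0, 6.0, 1.0, 11.0]])

R, pivots = rref(A)
col_basis = A[:, pivots]    # pivot columns of A form a basis for Col(A)

# Nul(A): set each free variable to 1 in turn and read the pivot
# variables off the rows of R.
free = [j for j in range(A.shape[1]) if j not in pivots]
nul_basis = []
for f in free:
    v = np.zeros(A.shape[1])
    v[f] = 1.0
    for r, c in enumerate(pivots):
        v[c] = -R[r, f]
    nul_basis.append(v)
```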

| Component | Points |
|-----------|--------|
| Quiz 1 | 20 points |
| Quiz 2 | 40 points |
| Quiz 3 | 60 points |
| MyLab Math Online | 40 points |
| Final | 100 points |
| Total | 260 points |

Exams and quizzes will be interpreted according to the following grading scale:

| Grade | Minimum Percentage |
|-------|--------------------|
| A | 90 % |
| B | 80 % |
| C | 70 % |
| D | 60 % |

The instructor reserves the right to give plus or minus grades and higher grades than shown on the scale if he believes they are warranted.

Last Updated: Tue Aug 27 12:09:11 PDT 2019