MTH 451/551: NUMERICAL LINEAR ALGEBRA - Fall 2014
Assignments
The homework solutions must be written up clearly and neatly. Pay attention to the use of proper mathematical notation. Do not overdo the exposition, but show enough work to justify your answers. When showing MATLAB work, make sure it is adequately labelled and referenced (do not just turn in a bunch of graphs and a raw MATLAB diary stapled together). Group discussions are permitted, but no exchange of written materials (including MATLAB code) is allowed; violations will be treated as academic dishonesty.
The numbers below refer to Lecture.Problem from the textbook. Turn in only the problems assigned to your section (451 or 551). Homework is due at the beginning of the class meeting.
  1. 9/29: Introduction and overview.
    Read: Part I, Lectures 1-3
  2. 10/1: Normed and inner product vector spaces. Vector and matrix norms.
    10/1 Office hours 11:00-12:00 and 12:00-13:00 in Math Learning Center computer lab (Kidd 108). Please come to the session you signed up for.
    These sessions are not mandatory. You can review/learn MATLAB on your own.
    HW 1 due Wed, 10/8: 1.1, 1.2, 2.2, 2.6, 3.2-3.5
    451 turn in: 1.1, (2.2 or 3.3), (1.2 or 2.6)
    551 turn in: 1.1, 2.2, (1.2 or 2.6), 3.3
  3. 10/3: Continue matrix norms; examples.
  4. 10/6: Floating point arithmetic and conditioning (Read Lectures 12-15).
    HW 2 (MATLAB) due Fri, 10/10:
    What is the result in MATLAB of inf, 0*inf, 1/0, 0/0?
    (*) Experiment in MATLAB to find a number x for which (1-(x+1))/x is not what it should be. Calculate the absolute and relative error for this operation. How did you find x?
    451 turn in: (*), 13.2c, 13.3, 12.3a
    551 turn in: (*), 13.2c, 13.3, 12.3a-c
    Hints on coding that may be useful for this HW.
    To superimpose multiple plots on the same picture (as in 12.3), use hold on.
    The following code can be helpful for 13.2c (you must modify it)
    n=1; while n < inf, n=n*10,end
    Eigenvalues of a random matrix A will likely be complex, so when plotting them with plot, what do you see? What about the eigenvalues of A'*A? (See Pbm 2.3.)
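    In the same spirit as the loop above (which probes the largest representable number), repeated halving locates machine epsilon. A Python analogue of this kind of probe (the HW itself should be done in MATLAB):

```python
# Probe the gap between 1.0 and the next larger double: halve a candidate
# until adding half of it to 1.0 no longer changes the value.
eps = 1.0
while 1.0 + eps / 2.0 > 1.0:
    eps /= 2.0
print(eps)   # 2**-52, the IEEE double-precision machine epsilon
```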
  5. 10/8: Examples of conditioning. Relative and absolute error in solving linear systems.
  6. 10/10: Positive definite matrices, and quadratic forms. Minimization of an spd quadratic form over unit vectors.
    HW 3 due Friday, 10/17:
    451 turn in (4 problems): 4.1, 5.3, either 4.3 (MATLAB) or 4.2, and 4.5
    551 turn in (5 problems): 4.1, 5.3, 4.3 (MATLAB), 4.2, and 4.5
    Extra: 4.4, 5.1, 5.4
  7. 10/13: SVD and examples.
  8. 10/15: continue SVD examples for rectangular matrices. Reduced/thin SVD.
  9. 10/17: projections, QR decomposition, and orthogonalization (Gram-Schmidt). Lectures 6-8 and 10.
  10. 10/20: modified Gram-Schmidt and Householder reflections.
    HW on orthogonality and QR (do not turn in): 6.1, 6.3, 6.4, 7.1, 7.5, 8.1, 8.3, 10.4; Repeat Experiment 2 in Lecture 9.
  11. 10/22: no class
  12. 10/24: MIDTERM (Friday). A note card (index size) will be allowed. You will be asked to staple the note card to your exam.
    Scope of exam: review matrix norms and conditioning, SVD and QR, flops (operation counts), and basic ideas about stability. Make sure you know how to compute the inverse of a 2x2 matrix!
    HW 4 due Friday, 10/31:
    Consider the 4x3 matrix given in MATLAB notation as A=[1,1,1;a,0,0;0,a,0;0,0,a], where a is a small number such that a^2 is less than machine epsilon (that is, adding a^2 to anything does not change its value).
    1. Perform (by hand computation) the steps of the classical (cGS) and modified Gram-Schmidt (mGS) algorithm to orthogonalize the columns of A. You should discover the loss of orthogonality for cGS.
    2. Use a program in MATLAB to do the same thing as in 1) and compare the results for a large a and for a small a as above. (Show the program.) Compare with the QR factorization computed by MATLAB.
    3. 551 (extra for 451). Use Householder reflections to do the job in 1).
    4. Prove by hand computation that the least squares solution minimizes the norm of the residual b-Ax for a general real m x n matrix with m>n (details in class). Only multivariable calculus techniques are needed. (451 students can consider m=n=2 and then generalize to m>n=2.)
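    For orientation only (the write-up itself should be your own MATLAB work), the cGS/mGS contrast in part 1 can be sketched in Python. Helper names here are made up, and a = 1e-9 is just one choice with a^2 below machine epsilon; the only difference between the two algorithms is which vector the projection coefficients are computed from.

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [vi / n for vi in v]

def gram_schmidt(cols, modified):
    """Orthonormalize the columns; 'modified' switches cGS -> mGS."""
    q = []
    for a in cols:
        v = a[:]
        for qj in q:
            c = dot(qj, v) if modified else dot(qj, a)   # the only difference
            v = [vi - c * qi for vi, qi in zip(v, qj)]
        q.append(normalize(v))
    return q

a = 1e-9                    # a**2 = 1e-18 is below machine epsilon
# Columns of the 4x3 matrix A = [1,1,1; a,0,0; 0,a,0; 0,0,a]:
A = [[1, a, 0, 0], [1, 0, a, 0], [1, 0, 0, a]]
Qc = gram_schmidt(A, modified=False)
Qm = gram_schmidt(A, modified=True)
# cGS loses orthogonality (|q2 . q3| is about 0.5); mGS keeps it near roundoff.
print(abs(dot(Qc[1], Qc[2])), abs(dot(Qm[1], Qm[2])))
```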
  13. 10/31: Stability and conditioning revisited. Stability and backward stability.
    Bound on the error of an algorithm for solving a given problem = conditioning of the problem + (backward) stability of the algorithm.
    [(Re)read lectures 13-15].
    HW (do not turn in): 13.1, 14.1, 14.2a, 15.1. Practice LSQ with arbitrary rectangular matrices of full rank and row dimension greater than column dimension, extending those from Pbm 4.1. Use e_2 as the right-hand side.
    HW 5 due Friday, Nov. 7:
    (a) 14.2b.
    (b) (triple credit) Solve by hand calculation using each of the three methods discussed in class (normal equations, via QR, via SVD) the LSQ problem with matrix B = [ 1, 2; 0,1; 1,0] and right hand side b=[3,0,0]^T. Your calculations should include computation of QR, SVD, the pseudo-inverse, as well as demonstration of the minimization of the appropriate quadratic function (as discussed in class).
    (c) Solve the LSQ problem (b) in MATLAB. Compare your work on QR and SVD and your final answers to those given by MATLAB; explain any differences.
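    As a cross-check on the normal-equations route (one of the three methods), here is a Python sketch with hand-rolled helpers; the helper names are invented, and in MATLAB the 2x2 solve would just be backslash. It also verifies the orthogonality of the residual to range(B).

```python
def transpose(M):
    return [list(c) for c in zip(*M)]

def matvec(M, x):
    return [sum(mij * xj for mij, xj in zip(row, x)) for row in M]

def solve2(M, y):
    """Cramer's rule for a 2x2 system M z = y."""
    d = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(y[0] * M[1][1] - M[0][1] * y[1]) / d,
            (M[0][0] * y[1] - y[0] * M[1][0]) / d]

B = [[1.0, 2.0], [0.0, 1.0], [1.0, 0.0]]
b = [3.0, 0.0, 0.0]
Bt = transpose(B)
BtB = [matvec(Bt, col) for col in zip(*B)]   # B^T B (symmetric 2x2)
Btb = matvec(Bt, b)                          # B^T b
x = solve2(BtB, Btb)                         # least-squares solution
r = [bi - yi for bi, yi in zip(b, matvec(B, x))]   # residual b - Bx
print(x, matvec(Bt, r))       # B^T r = 0: r is orthogonal to range(B)
```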
  14. 11/3/14: stability of backward substitution and LSQ. [Read lightly Lectures 17-19]
  15. 11/5/14: back to different methods for (overdetermined, full rank) LSQ, examples of LSQ, wrap-up.
    General proof of conditioning of solving a linear system when matrix is perturbed.
  16. 11/7/14: Finish the proof. Overview of Part IV: solving linear systems. Direct vs. iterative methods. When Jacobi fails. What is the Cadillac of solving linear systems?
  17. 11/10/14: Gaussian elimination and (P)LU. [Read Lectures 20-22]
    HW 6 due Friday, Nov. 14: 20.3, 21.4, 21.6 (451 students can assume m=3), 22.2
    Do not turn in: 21.1, 21.2, 22.4
  18. 11/12/14: Why pivoting. Worksheet on stability of LU.
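    The point of the worksheet can be seen on a 2x2 example (a Python sketch; the matrix values are illustrative): without pivoting, the multiplier 1/a swamps the other entries, and the computed solution loses its first component entirely, while a row swap fixes everything.

```python
def solve_no_pivot(A, b):
    """Gaussian elimination without pivoting on a 2x2 system."""
    m = A[1][0] / A[0][0]            # multiplier: huge when the pivot is tiny
    a22 = A[1][1] - m * A[0][1]      # this update swamps A[1][1]
    b2 = b[1] - m * b[0]
    x2 = b2 / a22
    x1 = (b[0] - A[0][1] * x2) / A[0][0]
    return [x1, x2]

A = [[1e-20, 1.0], [1.0, 1.0]]
b = [1.0, 2.0]
x = solve_no_pivot(A, b)                         # exact answer is near [1, 1]
xp = solve_no_pivot([A[1], A[0]], [b[1], b[0]])  # same GE after a row swap
print(x, xp)    # x1 is completely lost without pivoting
```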
  19. 11/14/14: Spd matrices: where they come from, how to solve. Unitary diagonalization of normal matrices.
    Direct method for spd matrices: Cholesky decomposition. [Read Lecture 23.]
    Review diagonalization and eigenvalues. [Lecture 24].
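    The Cholesky decomposition of Lecture 23 fits in a few lines. A Python sketch (the test matrix is illustrative, not from the textbook):

```python
import math

def cholesky(A):
    """Return lower-triangular L with A = L L^T (A must be spd)."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # sqrt of a negative => A not spd
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0], [2.0, 5.0]]    # spd example
L = cholesky(A)
print(L)    # [[2.0, 0.0], [1.0, 2.0]]; L times L^T reproduces A
```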
  20. 11/17/14: Theory of stationary iterative methods. [Read handout distributed in class]. Consistency, a-priori, and a-posteriori error estimates. Example: Jacobi method (converges for strictly row diagonally dominant matrices).
  21. 11/19/14: Continue stationary methods. Application of Gershgorin theorem to prove that Jacobi method converges for strictly diagonally dominant matrices.
    HW (Do not turn in): 24.1
    HW 7 due Wednesday, Nov. 26: 24.2 (451 do parts c and d, 551 do all), 24.4 (451 do a, 551 do both). Then solve (451 do one, 551 do both) the following computational problems:
    (C1) Implement the Jacobi method to solve the problem Au=b, where A is the negative discrete Laplacian in 1d of size m=1/h=10 and the right hand side equals [1,1,...,1]^T.
    (C2) For the same matrix as in (C1), plot the spectral radius of the iteration matrix for Richardson iteration as a function of the parameter alpha. Find the optimal alpha for which you get the minimum spectral radius; compare with the theory given in the handout.
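    Separate from the HW problems above, the convergence statement from class (Jacobi converges for strictly row diagonally dominant matrices) can be illustrated with a short Python sketch; the 3x3 matrix here is made up, chosen so the exact solution is [1, 1, 1].

```python
def jacobi(A, b, iters=100):
    """Jacobi sweep: x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii."""
    n = len(A)
    x = [0.0] * n
    for _ in range(iters):
        x = [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
             for i in range(n)]
    return x

A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 2.0],
     [0.0, 1.0, 3.0]]           # strictly row diagonally dominant
b = [6.0, 8.0, 4.0]             # chosen so the exact solution is [1, 1, 1]
x = jacobi(A, b)
print(x)    # converges toward [1.0, 1.0, 1.0]
```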
  22. 11/21/2014: More on spd matrices and the "A"-inner product. Steepest descent algorithm.
  23. 11/24/2014: Conjugate gradient algorithm and its properties. [Lecture 38]
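    A minimal sketch of the CG iteration discussed in class, in Python (the 2x2 spd matrix is illustrative); for a 2x2 system the iteration terminates with the exact solution in two steps, as the theory predicts.

```python
import math

def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg(A, b, tol=1e-12):
    """Conjugate gradients for spd A, starting from x = 0."""
    x = [0.0] * len(b)
    r = b[:]                    # residual b - Ax
    p = r[:]                    # initial search direction
    rr = dot(r, r)
    for _ in range(len(b)):     # at most n steps in exact arithmetic
        Ap = matvec(A, p)
        alpha = rr / dot(p, Ap)             # step length minimizing along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, Ap)]
        rr_new = dot(r, r)
        if math.sqrt(rr_new) < tol:
            break
        p = [ri + (rr_new / rr) * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]    # spd
b = [1.0, 2.0]
x = cg(A, b)
print(x)    # exact solution is [1/11, 7/11]
```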
  24. 11/26/2014: Preconditioning and convergence properties of (P)CG. Krylov subspaces. [Lecture 35, lightly].
    HW 8 due Friday, Dec. 5: 38.3-4, 25.2.
  25. 12/1/2014: GMRES (lightly). Start numerical methods for eigenvalues. Demonstration of power iteration.
  26. 12/3/2014: Eigenvalues and eigenvectors using power iteration, inverse iteration, and Rayleigh quotient iteration. The Rayleigh quotient.
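    The power iteration demonstrated in class can be sketched as follows (Python; the 2x2 matrix is illustrative): repeated multiplication by A drives the vector toward the dominant eigenvector, and the Rayleigh quotient x^T A x / x^T x then estimates the dominant eigenvalue.

```python
import math

def power_iteration(A, iters=200):
    """Dominant eigenvalue/eigenvector via normalized power iteration."""
    x = [1.0] * len(A)          # start must have a component along v1
    for _ in range(iters):
        y = [sum(aij * xj for aij, xj in zip(row, x)) for row in A]
        n = math.sqrt(sum(yi * yi for yi in y))
        x = [yi / n for yi in y]
    Ax = [sum(aij * xj for aij, xj in zip(row, x)) for row in A]
    rayleigh = sum(xi * yi for xi, yi in zip(x, Ax))   # x^T A x with ||x|| = 1
    return rayleigh, x

A = [[2.0, 1.0], [1.0, 2.0]]    # eigenvalues 3 and 1
lam, v = power_iteration(A)
print(lam)    # approximately 3, the dominant eigenvalue
```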
  27. 12/5/2014: Review.