Semester II, 2019-20
Course Description
Content: Conditioning and stability, floating-point arithmetic. Numerical linear algebra: Vector and matrix norms, singular value decomposition, QR factorization, LU and Cholesky factorizations, conjugate gradient method, eigenvalue algorithms. Nonlinear problems: root finding, interpolation, numerical integration and differentiation, unconstrained and constrained optimization.
Objectives: At the end of the course, students will be able to:
- Analyze the conditioning of numerical problems and the stability of numerical algorithms
- State and derive the stability properties, computational complexity, and applicability criteria of standard numerical algorithms
- Identify the numerical algorithms best suited for given problems
- Solve complex problems by designing and implementing appropriate combinations of numerical algorithms
Textbooks:
- Heath, Scientific Computing: An Introductory Survey, 2nd Ed.
- Trefethen and Bau, Numerical Linear Algebra
- Boyd and Vandenberghe, Convex Optimization
Prerequisites: COL 100 / CSL 100 / CSL 101 / CSL 102 or equivalent. Overlaps with MTL 704. Familiarity with linear algebra and calculus is assumed.
Lectures
- Monday, 30 Dec: Introduction, conditioning (Heath 1.1, Trefethen & Bau 12)
- Practice exercises: Heath review questions 1.14, 1.15, 1.18–1.20, exercises 1.1–1.5
- Thursday, 2 Jan: Floating-point arithmetic, accuracy and stability (Trefethen & Bau 13, 14, 15)
- Monday, 6 Jan: Review of linear algebra (T&B 1, 2)
- Thursday, 9 Jan: Norms (T&B 3)
- Practice exercises: T&B 2.4, 3.1–6
- Monday, 13 Jan: The singular value decomposition (T&B 4, 5)
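A small Python/NumPy sketch of the central use of the SVD from this lecture, low-rank approximation (the Eckart–Young result discussed in T&B 5); illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))

# Reduced SVD: A = U @ diag(s) @ Vt, singular values sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(U @ np.diag(s) @ Vt, A)

# Best rank-k approximation in the 2-norm: truncate the SVD.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
# The 2-norm error of the truncation equals the first discarded singular value.
print(np.linalg.norm(A - A_k, 2), s[k])
```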
- Thursday, 16 Jan: Projectors and QR factorization (T&B 6, 7)
- Practice exercises: T&B 6.1–2, 6.5, 7.1–5
- Monday, 20 Jan: Householder triangularization (T&B 8, 9, 10)
- Thursday, 23 Jan: Solving linear and least-squares problems (T&B 16, 11, 18)
- Practice exercises: T&B 16.1–2, 11.1, 11.3 (if using Python, use scipy.linalg.solve in place of Matlab's \), 18.1–2, 18.4
- Further reading: T&B 17 (“Stability of Back Substitution”)
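To connect this lecture's two themes, here is an illustrative Python/NumPy sketch (not from the readings) solving a least-squares problem via the reduced QR factorization, cross-checked against NumPy's built-in solver:

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))   # overdetermined system
b = rng.standard_normal(20)

# Least squares via reduced QR: minimize ||Ax - b|| by solving R x = Q^T b.
Q, R = np.linalg.qr(A)                       # reduced QR by default
x_qr = linalg.solve_triangular(R, Q.T @ b)   # R is upper triangular

# Cross-check against the library least-squares solver.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_qr, x_ls)
```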
- Monday, 27 Jan: LU factorization (T&B 20, 21, 22)
- Practice exercises: T&B 20.1–5, 21.1–3, 21.6–7
- Thursday, 30 Jan: Cholesky factorization (T&B 23)
- Practice exercises: T&B 23.1–2
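An illustrative Python/NumPy sketch of the standard Cholesky-based solve for a symmetric positive definite system (two triangular solves after the factorization); not part of the assigned exercises:

```python
import numpy as np
from scipy import linalg

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)    # symmetric positive definite by construction
b = rng.standard_normal(5)

# Cholesky: A = L L^T with L lower triangular; solve A x = b in two stages.
L = np.linalg.cholesky(A)
y = linalg.solve_triangular(L, b, lower=True)     # L y = b
x = linalg.solve_triangular(L.T, y, lower=False)  # L^T x = y
assert np.allclose(A @ x, b)
```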
- Monday, 10 Feb: Iterative methods (T&B 32, H 11.5)
- Practice exercises: H exercises 11.6–7
- Thursday, 13 Feb: Arnoldi and GMRES (T&B 33, 35)
- Practice exercises: T&B 33.2, 35.1–2, 35.4–6
- Monday, 17 Feb: The conjugate gradient method (T&B 38, 40)
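A minimal textbook-style implementation of the conjugate gradient iteration in Python/NumPy, for orientation before the T&B treatment; a sketch, not a production solver:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxiter=None):
    """Solve A x = b for symmetric positive definite A (textbook CG sketch)."""
    n = len(b)
    maxiter = maxiter or n
    x = np.zeros(n)
    r = b - A @ x           # residual
    p = r.copy()            # first search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # next A-conjugate search direction
        rs = rs_new
    return x

rng = np.random.default_rng(3)
M = rng.standard_normal((8, 8))
A = M @ M.T + 8 * np.eye(8)   # SPD test matrix
b = rng.standard_normal(8)
x = conjugate_gradient(A, b)
assert np.allclose(A @ x, b)
```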
- Thursday, 20 Feb: Eigenvalues (T&B 24, 25)
- Practice exercises: T&B 24.1–4 (use scipy.linalg.expm for the matrix exponential), 25.1–2
- Saturday, 22 Feb: Eigenvalue iterations (T&B 26, 27)
- Practice exercises: T&B 26.1, 27.1, 27.3–6
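A short Python/NumPy sketch of the simplest eigenvalue iteration from this lecture, power iteration with a Rayleigh quotient estimate; the example matrix is an illustrative choice, not from the exercises:

```python
import numpy as np

def power_iteration(A, iters=200):
    """Estimate the dominant eigenpair by repeated multiplication and normalization."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    lam = v @ A @ v            # Rayleigh quotient (A is symmetric here)
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])     # symmetric; eigenvalues (5 ± sqrt(5)) / 2
lam, v = power_iteration(A)
print(lam)                      # converges to the largest eigenvalue, ≈ 3.618
```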
- Monday, 24 Feb: The QR algorithm, conditioning of eigenproblems (T&B 28, Heath 4.3)
- Thursday, 27 Feb: Nonlinear equations (Heath 5.1–5.5.3)
- Practice exercises: Heath review questions 5.12, 5.16, Heath exercises 5.3, 5.4, 5.6, 5.12
- Monday, 2 Mar: Nonlinear systems, interpolation (H 5.6, 7.1–7.3.2)
- Practice exercises: H review questions 5.(21,23,24,29), 7.(8,11,15), H exercises 5.(8,9,11,13)
- Thursday, 5 Mar: Polynomial interpolation (H 7.3.3–5)
- Practice exercises: H review questions 7.(19,20), H exercises 7.(5–7,13,14)
- Further reading: H 7.4 (“Piecewise Polynomial Interpolation”); Fritsch and Carlson, “Monotone Piecewise Cubic Interpolation”, 1980
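The node-placement issue at the heart of this lecture can be seen in a few lines of Python/SciPy: interpolating Runge's function at equispaced versus Chebyshev points (an illustrative sketch; the degree and sample grid are arbitrary choices):

```python
import numpy as np
from scipy import interpolate

# Runge's function on [-1, 1]: a classic example of equispaced interpolation failing.
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)

n = 12
x_equi = np.linspace(-1.0, 1.0, n + 1)           # equispaced nodes
x_cheb = np.cos(np.pi * np.arange(n + 1) / n)    # Chebyshev points

p_equi = interpolate.BarycentricInterpolator(x_equi, f(x_equi))
p_cheb = interpolate.BarycentricInterpolator(x_cheb, f(x_cheb))

xs = np.linspace(-1.0, 1.0, 1001)
err_equi = np.max(np.abs(p_equi(xs) - f(xs)))
err_cheb = np.max(np.abs(p_cheb(xs) - f(xs)))
print(err_equi, err_cheb)   # equispaced nodes oscillate badly near ±1; Chebyshev do not
```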
- Numerical integration (H 8.1–2)
- Numerical integration cont. (H 8.3–8.3.4)
- Practice exercises: H exercises 8.(1,4,7,9,10)
- Further reading: Linked in the slides
- Numerical integration and differentiation (H 8.3.5–6, 8.6–7)
- Practice exercises: H exercises 8.(2,11–15)
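The convergence behaviour discussed in these lectures can be checked numerically; below is an illustrative Python sketch of the composite trapezoidal rule (hand-rolled, to make the formula explicit) next to SciPy's adaptive quadrature:

```python
import numpy as np
from scipy import integrate

f = np.exp
exact = np.e - 1.0             # integral of e^x over [0, 1]

# Composite trapezoidal rule: error is O(h^2) in the spacing h = 1/n.
errs = []
for n in (8, 16, 32):
    x = np.linspace(0.0, 1.0, n + 1)
    fx = f(x)
    h = 1.0 / n
    T = h * (fx.sum() - 0.5 * (fx[0] + fx[-1]))
    errs.append(abs(T - exact))
    print(n, errs[-1])          # error drops by roughly 4x each time n doubles

# Adaptive quadrature from SciPy for comparison.
val, err_est = integrate.quad(f, 0.0, 1.0)
assert np.isclose(val, exact)
```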
- Optimization, convexity (Boyd & Vandenberghe 2.1–2.3.2, 3.1–3.1.7, 3.2–3.2.5)
- Practice exercises: B&V exercises 2.(1,4,6,10,12,14,16), 3.(2,3,6,7,16,17,21,30)
- Unconstrained optimization (B&V 2.5, 4.1–4.1.2, 4.2–4.2.3, 9.1–9.1.1, 9.2, 9.3.2, 9.4–9.4.2, 9.4.4)
- Practice exercises: B&V exercises 2.(21,24,25), 4.1, 9.(1,6,8)
- Analysis of descent methods (B&V 9.1.2, 9.3.1)
- Practice exercises: B&V exercise 9.5
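The descent framework analyzed here can be sketched in a few lines of Python/NumPy: gradient descent with a backtracking (Armijo) line search in the style of B&V chapter 9, applied to a convex quadratic whose minimizer we can verify directly (the test problem is an illustrative choice):

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha=0.3, beta=0.5, tol=1e-8, maxiter=500):
    """Gradient descent with backtracking line search (B&V-style sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(maxiter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        # Backtrack until the sufficient-decrease (Armijo) condition holds.
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Minimize f(x) = 1/2 x^T A x - b^T x; the minimizer is x* = A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = gradient_descent(f, grad, np.zeros(2))
assert np.allclose(x_star, np.linalg.solve(A, b))
```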
- Newton’s method for optimization (B&V 9.5)
- Practice exercises: B&V exercises 9.(9–11,30)
- Further reading: B&V 9.6 (“Self-concordance”)
- Equivalent optimization problems (B&V 4.1.3, 4.2.4)
- Practice exercises: B&V exercises 4.(3,5)
- Linear and quadratic constrained optimization problems (B&V 4.3, 4.4)
- Practice exercises: B&V exercises 4.(8,9,11,12,15,21)
- Lagrange duality (B&V 5.1–5.1.5, 5.2–5.2.2)
- Practice exercises: B&V exercises 5.(1,2,4,7(a,b),12)
- Strong duality and optimality conditions (B&V 5.2.3–5.2.4, 5.4, 5.5–5.5.4)
- Practice exercises: B&V exercises 5.(21(a–c),24,26,27,31)
- Equality constrained optimization (B&V 10.1–10.1.3)
- Practice exercises: B&V exercises 10.(1,2)
- Equality constrained Newton’s method (B&V 10.2–10.3.2)
- Practice exercises: B&V exercises 10.(5,6,8)
- Inequality constrained optimization (B&V 11.1–11.3.2)
- Practice exercises: B&V exercises 11.(2,6,7,10)
- Feasibility and phase I methods (B&V 11.4)
Evaluation
- Homework assignments: 30% (dates for future assignments are tentative)
- Minor I: 20%
- Minor II: 20%
- Major: 30%
Students may use either Matlab or Python 3 with Numpy/Scipy for the programming component of the homework. If you do not have prior experience with either, I suggest choosing Python; please go through one of the following tutorials to familiarize yourself:
Grading: Following institute policy, a minimum of 80% marks is required for an A grade, and a minimum of 30% for a D grade.
Late policy: Homework assignments are due at midnight on the due date. You are allowed a total of 5 late days across all the assignments. Any assignment submitted after the allowed late days have been exhausted will not be graded.
Audit policy: Earning marks equivalent to a C grade or better is required for audit pass.
Attendance policy: Attendance lower than 75% will result in a one-grade penalty (e.g. A to A–, or A– to B).
Collaboration policy: Adapted from Dan Weld’s guidelines, via Mausam:
Collaboration is a very good thing. On the other hand, cheating is considered a very serious offense. Please don’t do it! Concern about cheating creates an unpleasant environment for everyone. If you cheat, you get a zero in the assignment, and additionally you risk losing your position as a student in the department and the institute. The department’s policy on cheating is to report any cases to the disciplinary committee. What follows afterwards is not fun.
So how do you draw the line between collaboration and cheating? Here’s a reasonable set of ground rules. Failure to understand and follow these rules will constitute cheating, and will be dealt with as per institute guidelines.
- The Kyunki Saas Bhi Kabhi Bahu Thi Rule: This rule says that you are free to meet with fellow student(s) and discuss assignments with them. Writing on a board or shared piece of paper is acceptable during the meeting; however, you should not take any written (electronic or otherwise) record away from the meeting. This applies when the assignment is supposed to be an individual effort or whenever two teams discuss common problems they are each encountering (inter-group collaboration). After the meeting, engage in a half hour of mind-numbing activity (like watching an episode of Kyunki Saas Bhi Kabhi Bahu Thi) before starting to work on the assignment. This will ensure that you are able to reconstruct what you learned from the meeting, by yourself, using your own brain.
- The Right to Information Rule: To assure that all collaboration is on the level, you must always write the name(s) of your collaborators on your assignment. This also applies when two groups collaborate.