# Semester II, 2021–22

## Course Description

Content: Conditioning and stability, floating-point arithmetic. Numerical linear algebra: Vector and matrix norms, singular value decomposition, QR factorization, LU and Cholesky factorizations, conjugate gradient method, eigenvalue algorithms. Nonlinear problems: root finding, interpolation, numerical integration and differentiation, unconstrained optimization.

Objectives: At the end of the course, students will be able to:

• Analyze the conditioning of numerical problems and the stability of numerical algorithms
• State and derive the stability properties, computational complexity, and applicability criteria of standard numerical algorithms
• Identify the numerical algorithms best suited for given problems
• Solve complex problems by designing and implementing appropriate combinations of numerical algorithms

Textbooks:

• Heath, Scientific Computing: An Introductory Survey, 2nd Ed.
• Trefethen and Bau, Numerical Linear Algebra
• Boyd and Vandenberghe, Convex Optimization

Prerequisites: COL106 or equivalent. Overlaps with MTL704. Familiarity with linear algebra and calculus is assumed.

## Lectures

1. Mon, 3 Jan: Introduction, conditioning (Heath 1.1–1.2.4, 1.2.6, Trefethen & Bau 12 up to “Relative Condition Number”)
• Practice problems: Heath exercises 1.1–1.5, computer problem 1.1
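For a quick feel for conditioning (an illustrative sketch, not an assigned problem; the Hilbert matrix here is just a standard ill-conditioned example), NumPy computes the 2-norm condition number, which equals the ratio of extreme singular values:

```python
import numpy as np

# The 2-norm condition number of a matrix is sigma_max / sigma_min.
# The Hilbert matrix H[i, j] = 1 / (i + j + 1) is famously ill-conditioned.
n = 8
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])

kappa = np.linalg.cond(H)                 # defaults to the 2-norm
s = np.linalg.svd(H, compute_uv=False)    # singular values, descending
print(kappa, s[0] / s[-1])                # the two agree
```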
2. Thu, 6 Jan: Floating-point arithmetic, accuracy and stability (Trefethen & Bau 12–14)
• Practice problems: Heath exercises 1.6, 1.9–12, computer problems 1.3–8, Trefethen & Bau exercises 13.*, 14.*
• Further reading (optional): Goldberg, “What Every Computer Scientist Should Know About Floating-Point Arithmetic”, 1991
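To see rounding error concretely (a small illustration, not part of the reading): IEEE double precision has machine epsilon 2⁻⁵², and subtracting nearly equal numbers loses most of the significant digits (catastrophic cancellation):

```python
import numpy as np

# Machine epsilon for IEEE double precision: the spacing between 1.0
# and the next representable number is 2**-52.
eps = np.finfo(np.float64).eps
print(eps)                      # 2.220446049250313e-16

# Catastrophic cancellation: (1 + x) - 1 is inaccurate when x is tiny
# compared to 1, because 1 + x is rounded to a multiple of eps.
x = 1e-15
print((1.0 + x) - 1.0)          # not 1e-15: roughly 10% relative error
```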
3. Mon, 10 Jan: More about stability, linear algebra (T&B 15, 1 up to “Range and Nullspace”)
• Practice problems: T&B exercises 15.1, 1.1–2
4. Thu, 13 Jan: Linear algebra, orthogonality (T&B 1, 2)
• Practice problems: T&B exercises 1.3–4, 2.*
5. Mon, 17 Jan: Norms, SVD (T&B 3, 4 up to “Reduced SVD”)
• Practice problems: T&B exercises 3.*
6. Thu, 20 Jan: More about SVD, projections (T&B 4, 5, 6)
• Practice problems: T&B exercises 4.* (use `scipy.linalg.svd` for 4.3), 5.*, 6.*
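A minimal `scipy.linalg.svd` sketch (the random matrix is just an example), showing the reduced SVD reconstructing the original matrix:

```python
import numpy as np
from scipy.linalg import svd

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Full SVD: U is 5x5, s holds the 3 singular values, Vt is 3x3.
U, s, Vt = svd(A)

# Reduced SVD: keep only the first 3 columns of U; this already
# reconstructs A exactly.
A_rec = U[:, :3] @ np.diag(s) @ Vt
print(np.allclose(A, A_rec))    # True
```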
7. Mon, 24 Jan: QR factorization (T&B 7, 8, 9)
• Practice problems: T&B exercises 7.*, 8.*, 9.* (all programming exercises should be done in Python using NumPy, SciPy, and Matplotlib)
8. Thu, 27 Jan: Householder triangularization, least-squares problems (T&B 10, 16, 11)
• Practice problems: T&B exercises 10.*, 16.*, 11.* (use `scipy.linalg.solve` for Matlab’s `\`)
• Further reading: T&B 17 (“Stability of Back Substitution”)
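As a sketch of how QR solves least-squares problems (random data as an example; `scipy.linalg.solve_triangular` plays the role of back substitution after the orthogonal reduction):

```python
import numpy as np
from scipy.linalg import qr, solve_triangular

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)

# Reduced QR: A = Q R with Q 10x3 orthonormal, R 3x3 upper triangular.
Q, R = qr(A, mode="economic")

# Least-squares solution of A x ~= b: solve the triangular system R x = Q^T b.
x = solve_triangular(R, Q.T @ b)

# Agrees with the library least-squares routine.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))    # True
```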
9. Mon, 31 Jan: Conditioning of least-squares, LU factorization (T&B 18 up to Theorem 18.1, 20, 21)
• Practice problems: T&B exercises 18.1–3, 20.*, 21.*
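A quick `scipy.linalg.lu` illustration (the 3×3 matrix is a small example); SciPy returns the partially pivoted factorization in the form A = P L U:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

# Partial pivoting: P is a permutation matrix, L unit lower triangular,
# U upper triangular, with A = P @ L @ U.
P, L, U = lu(A)
print(np.allclose(A, P @ L @ U))    # True
```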
10. Thu, 3 Feb: Stability of LU, Cholesky factorization, iterative methods (T&B 22, 23, 32)
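For symmetric positive definite matrices, Cholesky gives A = L Lᵀ at about half the cost of LU and is stable without pivoting; a minimal sketch (the example matrix is chosen to be SPD):

```python
import numpy as np

# Cholesky factorization A = L L^T, valid for symmetric positive
# definite A; no pivoting is needed.
A = np.array([[4.0, 2.0, 2.0],
              [2.0, 5.0, 3.0],
              [2.0, 3.0, 6.0]])
L = np.linalg.cholesky(A)        # lower triangular factor
print(np.allclose(A, L @ L.T))   # True
```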
11. Mon, 7 Feb: Arnoldi, GMRES (T&B 33, 35)
• Practice problems: T&B exercises 33.1–2, 35.*
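A minimal GMRES call via SciPy (the well-conditioned nonsymmetric matrix is just an example; `info == 0` signals convergence):

```python
import numpy as np
from scipy.sparse.linalg import gmres

rng = np.random.default_rng(0)
n = 50
# A nonsymmetric but well-conditioned system: identity plus a small
# random perturbation, so the eigenvalues cluster around 1.
A = np.eye(n) + 0.05 * rng.standard_normal((n, n))
b = rng.standard_normal(n)

x, info = gmres(A, b, atol=1e-10)
print(info, np.linalg.norm(A @ x - b))   # info == 0 means converged
```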
12. Thu, 10 Feb: Conjugate gradients (T&B 38)
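The conjugate gradient iteration itself fits in a few lines; a sketch for SPD systems (cf. T&B Lecture 38 — the test matrix here is just an example built to be SPD):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, maxiter=1000):
    """Solve A x = b for symmetric positive definite A by CG."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # A-conjugate direction update
        rs = rs_new
    return x

# Example SPD system: M^T M + I is always SPD.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M.T @ M + np.eye(20)
b = rng.standard_normal(20)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))    # True
```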
13. Mon, 21 Feb: Eigenvalues (T&B 24, 25, 26)
14. Thu, 24 Feb: Eigenvalue algorithms (T&B 27, 28)
• Practice problems: T&B exercises 27.*, 28.* (look up convex hull on p. 24 of Boyd & Vandenberghe)
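The simplest eigenvalue algorithm, power iteration, can be sketched as follows (the 2×2 matrix is an example, with eigenvalues (5 ± √5)/2):

```python
import numpy as np

def power_iteration(A, iters=500):
    """Largest-magnitude eigenvalue and eigenvector of A by power iteration."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    for _ in range(iters):
        w = A @ v
        v = w / np.linalg.norm(w)   # normalize to avoid overflow
    lam = v @ A @ v                 # Rayleigh quotient estimate
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_iteration(A)
print(lam)                          # ~ (5 + sqrt(5)) / 2 ≈ 3.618
```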
15. Thu, 3 Mar: Conditioning of eigenvalues, nonlinear equations (Heath 4.3, 5.1–5.5)
• Practice problems: Heath review questions 5.5, 5.9–14
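Newton’s method for a scalar root, as a sketch (f(x) = x² − 2 is just an example; convergence is quadratic, roughly doubling the correct digits per step):

```python
def newton(f, fprime, x0, tol=1e-12, maxiter=50):
    """Newton's method for a scalar root: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(maxiter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / fprime(x)
    return x

# sqrt(2) as the root of f(x) = x^2 - 2, starting from x0 = 1.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)        # 1.41421356...
```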
16. Mon, 7 Mar: Nonlinear systems of equations (H 5.5.3–4, 5.6)
17. Thu, 10 Mar: Interpolation (H 7.1–7.3.3)
• Practice problems: H review questions 7.8–17, exercises 7.3–6, computer problems 7.1–2
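A polynomial interpolation sketch using NumPy’s `polyfit`/`polyval` (sampling sin on [0, π] is just an example; with 5 points and degree 4 the fit is an interpolant, not a least-squares approximation):

```python
import numpy as np

# Interpolate sin(x) at 5 equispaced points on [0, pi] by the unique
# degree-4 polynomial through them.
x = np.linspace(0.0, np.pi, 5)
y = np.sin(x)
c = np.polyfit(x, y, deg=4)     # 5 points, degree 4: exact interpolation
p = np.polyval(c, 1.0)
print(p, np.sin(1.0))           # close: the interpolation error is small
```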
18. Sat, 12 Mar: Piecewise polynomial interpolation, numerical integration (H 7.3.5–7.4.2, 8.1–8.2)
19. Mon, 14 Mar: Numerical integration (H 8.3–8.3.3)
20. Thu, 17 Mar: Numerical integration, differentiation, optimization (H 8.3.5–6, 8.6, Boyd & Vandenberghe 1)
• Practice problems: Heath exercises 8.11–13, computer problems 8.1(a,c), 8.2, 8.18 (look up the relevant routines in SciPy, e.g. `scipy.integrate.quad`)
• Further reading: Heath 8.6, “Richardson Extrapolation”
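A minimal `scipy.integrate.quad` example, as mentioned in the practice problems (the Gaussian integrand is just an illustration; `quad` handles the infinite interval adaptively and returns an error estimate):

```python
import numpy as np
from scipy.integrate import quad

# Adaptive quadrature: the integral of exp(-x^2) over [0, inf)
# equals sqrt(pi) / 2.
val, err = quad(lambda x: np.exp(-x * x), 0.0, np.inf)
print(val, np.sqrt(np.pi) / 2)   # agree to about the reported error estimate
```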
21. Mon, 21 Mar: Convex sets and functions (B&V 2.1–3, 3.1–2)
• Practice problems: B&V exercises 2.1–16, 3.1–12, 3.15–22, 3.28–32
• Further reading: B&V Appendix A.4, “Derivatives”
22. Thu, 24 Mar: Optimization (B&V 4.1–2, 9.1–2)
• Practice problems: B&V exercises 9.1–3, 9.5
23. Mon, 28 Mar: Descent methods (B&V 9.3–4)
• Practice problems: B&V 9.6, 9.8
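Gradient descent with backtracking line search (cf. B&V §9.2–9.3) can be sketched as follows; the quadratic test problem is just an example, with minimizer A⁻¹b:

```python
import numpy as np

def backtracking_gd(f, grad, x0, alpha=0.3, beta=0.5, tol=1e-8, maxiter=5000):
    """Gradient descent with backtracking line search."""
    x = x0
    for _ in range(maxiter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        # Backtrack until the sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Example: minimize (1/2) x^T A x - b^T x, whose minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x = backtracking_gd(f, grad, np.zeros(2))
print(np.allclose(x, np.linalg.solve(A, b)))   # True
```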
24. Thu, 31 Mar: Newton’s method (B&V 9.5)
• Practice problems: B&V exercises 9.4, 9.10, 9.11, 9.24, 9.25
• Further reading: H 6.5.5, “Secant Updating Methods”; Heath 6.6, “Nonlinear Least Squares”
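And a pure Newton step for one-dimensional minimization (cf. B&V §9.5), as a sketch; f(x) = eˣ + e⁻ˣ is just an example, with its minimum at x = 0:

```python
import math

def newton_minimize(fprime, fsecond, x0, tol=1e-12, maxiter=50):
    """Pure Newton's method for 1-D minimization: step = -f'(x) / f''(x)."""
    x = x0
    for _ in range(maxiter):
        step = fprime(x) / fsecond(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Minimize f(x) = e^x + e^{-x}, whose unique minimum is at x = 0.
xstar = newton_minimize(lambda x: math.exp(x) - math.exp(-x),
                        lambda x: math.exp(x) + math.exp(-x),
                        x0=1.0)
print(xstar)       # ~ 0.0
```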

## Evaluation

Students should use Python 3 with NumPy/SciPy for the programming component of the homework. If you do not have prior experience with these libraries, please go through one of the following tutorials to familiarize yourself:

Grading: Following institute policy, a minimum of 80% is required for an A grade, and a minimum of 30% for a D grade.

Late policy: Homework assignments are due at midnight on the due date. You are allowed a total of 5 late days across all assignments, with a maximum of 2 late days per assignment. An assignment submitted late after the allowed late days have been exhausted will not be graded.

Audit policy: A minimum of 40% marks is required for audit pass.

Attendance policy: Attendance lower than 50% may result in a one-grade penalty (e.g. A to A–, or A– to B).

Collaboration policy: Adapted from Dan Weld’s guidelines, via Mausam:

Collaboration is a very good thing. Cheating, on the other hand, is a very serious offense. Please don’t do it! Concern about cheating creates an unpleasant environment for everyone. If you cheat, you get a zero on the assignment, and you additionally risk losing your position as a student in the department and the institute. The department’s policy is to report any cases of cheating to the disciplinary committee. What follows afterwards is not fun.

So how do you draw the line between collaboration and cheating? Here’s a reasonable set of ground rules. Failure to understand and follow these rules will constitute cheating, and will be dealt with as per institute guidelines.

• The Kyunki Saas Bhi Kabhi Bahu Thi Rule: This rule says that you are free to meet with fellow student(s) and discuss assignments with them. Writing on a board or shared piece of paper is acceptable during the meeting; however, you should not take any written (electronic or otherwise) record away from the meeting. This applies when the assignment is supposed to be an individual effort, and whenever two teams discuss common problems they are each encountering (inter-group collaboration). After the meeting, engage in a half hour of mind-numbing activity (like watching an episode of Kyunki Saas Bhi Kabhi Bahu Thi) before starting to work on the assignment. This will ensure that you can reconstruct what you learned from the meeting, by yourself, using your own brain.
• The Right to Information Rule: To ensure that all collaboration is on the level, you must always write the name(s) of your collaborators on your assignment. This also applies when two groups collaborate.