**Content:** Conditioning and stability, floating-point arithmetic. *Numerical linear algebra*: Vector and matrix norms, singular value decomposition, QR factorization, LU and Cholesky factorizations, conjugate gradient method, eigenvalue algorithms. *Nonlinear problems:* root finding, interpolation, numerical integration and differentiation, unconstrained optimization.

**Objectives:** At the end of the course, students will be able to:

- Analyze the conditioning of numerical problems and the stability of numerical algorithms
- State and derive the stability properties, computational complexity, and applicability criteria of standard numerical algorithms
- Identify the numerical algorithms best suited for given problems
- Solve complex problems by designing and implementing appropriate combinations of numerical algorithms

**Textbooks:**

- Heath, *Scientific Computing: An Introductory Survey*, 2nd Ed.
- Trefethen and Bau, *Numerical Linear Algebra*
- Boyd and Vandenberghe, *Convex Optimization*

**Prerequisites:** COL106 or equivalent. Overlaps with MTL704. Familiarity with linear algebra and calculus is assumed.

- Mon, 3 Jan: Introduction, conditioning (Heath 1.1–1.2.4, 1.2.6, Trefethen & Bau 12 up to “Relative Condition Number”)
- Practice problems: Heath exercises 1.1–1.5, computer problem 1.1

- Thu, 6 Jan: Floating-point arithmetic, accuracy and stability (Trefethen & Bau 12–14)

- Mon, 10 Jan: More about stability, linear algebra (T&B 15, 1 up to “Range and Nullspace”)
- Practice problems: T&B exercises 15.1, 1.1–2

- Thu, 13 Jan: Linear algebra, orthogonality (T&B 1, 2)
- Practice problems: T&B exercises 1.3–4, 2.*

- Mon, 17 Jan: Norms, SVD (T&B 3, 4 up to “Reduced SVD”)
- Practice problems: T&B exercises 3.*

- Thu, 20 Jan: More about SVD, projections (T&B 4, 5, 6)
- Practice problems: T&B exercises 4.* (use `scipy.linalg.svd` for 4.3), 5.*, 6.*
- Further reading: Ranade, “Some uses of spectral methods”
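For the `scipy.linalg.svd` hint above, here is a minimal sketch (the matrix is illustrative, not from the exercises) of computing a reduced SVD and verifying the reconstruction:

```python
import numpy as np
from scipy.linalg import svd

# Random 5x3 matrix; full_matrices=False gives the reduced SVD
A = np.random.default_rng(0).standard_normal((5, 3))
U, s, Vt = svd(A, full_matrices=False)  # U: 5x3, s: 3 singular values, Vt: 3x3

# A is recovered (to rounding error) as U @ diag(s) @ Vt
assert np.allclose(A, U @ np.diag(s) @ Vt)
# Singular values come back sorted in decreasing order
assert np.all(np.diff(s) <= 0)
```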


- Mon, 24 Jan: QR factorization (T&B 7, 8, 9)
- Practice problems: T&B exercises 7.*, 8.*, 9.* (all programming exercises should be done in Python using Numpy, Scipy, and Matplotlib)

- Thu, 27 Jan: Householder triangularization, least-squares problems (T&B 10, 16, 11)
- Practice problems: T&B exercises 10.*, 16.*, 11.* (use `scipy.linalg.solve` for Matlab’s `\`)
- Further reading: T&B 17 (“Stability of Back Substitution”)
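As a sketch of the `scipy.linalg.solve` usage mentioned above (the data here is made up): it plays the role of Matlab’s `\` for square systems.

```python
import numpy as np
from scipy.linalg import solve

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))  # square, (almost surely) nonsingular
b = rng.standard_normal(4)

x = solve(A, b)  # analogue of Matlab's x = A \ b
assert np.allclose(A @ x, b)
```

For rectangular least-squares systems, the closer analogue of `\` is `scipy.linalg.lstsq`.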


- Mon, 31 Jan: Conditioning of least-squares, LU factorization (T&B 18 up to Theorem 18.1, 20, 21)
- Practice problems: T&B exercises 18.1–2, 18.3, 20.*, 21.*

- Thu, 3 Feb: Stability of LU, Cholesky factorization, iterative methods (T&B 22, 23, 32)
- Practice problems: T&B exercises 22.1–3, 23.* (in 23.3, think about how you could implement a function `solve(A,b)` that performs similarly to Matlab’s `A\b`), 32.*
- Further reading: Higham, “What Is the Growth Factor for Gaussian Elimination?”, 2020; Spielman and Teng, “Smoothed Analysis: An Attempt to Explain the Behavior of Algorithms in Practice”, 2009
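One possible shape of such a `solve(A,b)` is sketched below, using Scipy’s LU routines (the exercise may instead expect you to write the factorization yourself):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def solve(A, b):
    """Solve Ax = b like Matlab's A\\b for square A, via PA = LU."""
    lu, piv = lu_factor(A)         # LU factorization with partial pivoting
    return lu_solve((lu, piv), b)  # forward and back substitution

# Quick check on a small system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = solve(A, b)
assert np.allclose(A @ x, b)
```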


- Mon, 7 Feb: Arnoldi, GMRES (T&B 33, 35)
- Practice problems: T&B exercises 33.1–2, 35.*

- Thu, 10 Feb: Conjugate gradients (T&B 38)
- Practice problems: T&B exercises 38.*
- Further reading: Shewchuk, “An Introduction to the Conjugate Gradient Method Without the Agonizing Pain”, 1994

- Mon, 21 Feb: Eigenvalues (T&B 24, 25, 26)
- Practice problems: T&B exercises 24.*, 25.*, 26.* (use `scipy.linalg.expm` for the matrix exponential, and `matplotlib.pyplot.contour` for contour plotting)
- Further reading: Page et al., “The PageRank Citation Ranking: Bringing Order to the Web”, 1999
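A minimal sketch of the `scipy.linalg.expm` usage suggested above (the example matrix is illustrative, not from the exercises): the matrix exponential of the 2D rotation generator produces a rotation matrix.

```python
import numpy as np
from scipy.linalg import expm

# Generator of 2D rotations: expm(t*A) = [[cos t, sin t], [-sin t, cos t]]
A = np.array([[0.0, 1.0], [-1.0, 0.0]])
R = expm((np.pi / 2) * A)  # quarter turn
assert np.allclose(R, [[0.0, 1.0], [-1.0, 0.0]])
```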

- Thu, 24 Feb: Eigenvalue algorithms (T&B 27, 28)
- Practice problems: T&B exercises 27.*, 28.* (look up *convex hull* on p. 24 of Boyd & Vandenberghe)


- Thu, 3 Mar: ~~Conditioning of eigenvalues,~~ nonlinear equations (Heath ~~4.3,~~ 5.1–5.5)
- Practice problems: Heath review questions 5.5, 5.9–14

- Mon, 7 Mar: Nonlinear systems of equations (H 5.5.3–4, 5.6)
- Practice problems: H review questions 5.15–17, 5.23–24, exercises 5.1–6, 5.8–14, computer problems 5.27–29
- Further reading: Grinshpan, “The order of convergence for the secant method”

- Thu, 10 Mar: Interpolation (H 7.1–7.3.3)
- Practice problems: H review questions 7.8–17, exercises 7.3–6, computer problems 7.1–2

- Sat, 12 Mar: Piecewise polynomial interpolation, numerical integration (H 7.3.5–7.4.2, 8.1–8.2)
- Practice problems: H review questions 7.18–20, exercises 7.7, 7.14–15, computer problems 7.4–6 (use `scipy.interpolate.CubicSpline`)
- Further reading: Akima, “A New Method of Interpolation and Smooth Curve Fitting Based on Local Procedures”, 1970; Fritsch and Carlson, “Monotone Piecewise Cubic Interpolation”, 1980
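A minimal `scipy.interpolate.CubicSpline` sketch for the computer problems above (the data is illustrative):

```python
import numpy as np
from scipy.interpolate import CubicSpline

x = np.linspace(0.0, 2.0 * np.pi, 9)
cs = CubicSpline(x, np.sin(x))  # default 'not-a-knot' boundary conditions

assert np.allclose(cs(x), np.sin(x))      # interpolates the data exactly
assert abs(cs(1.0) - np.sin(1.0)) < 1e-2  # small error between knots
cs1 = cs.derivative()                     # splines can be differentiated
assert abs(cs1(np.pi) - np.cos(np.pi)) < 1e-1
```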


- Mon, 14 Mar: Numerical integration (H 8.3–8.3.3)
- Practice problems: H exercises 8.1, 8.2(a), 8.3–4, 8.7–10
- Further reading: Trefethen, “Six Myths of Polynomial Interpolation and Quadrature”, 2011

- Thu, 17 Mar: Numerical integration, differentiation, optimization (H 8.3.5–6, 8.6, Boyd & Vandenberghe 1)
- Practice problems: Heath exercises 8.11–13, computer problems 8.1(a,c), 8.2, 8.18 (look up the relevant routines in Scipy, e.g. `scipy.integrate.quad`)
- Further reading: Heath 8.6, “Richardson Extrapolation”
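For the Scipy-routine lookup above, a minimal `scipy.integrate.quad` sketch (the integrand is my own example):

```python
import numpy as np
from scipy.integrate import quad

# quad returns (value, estimated absolute error)
val, err = quad(np.sin, 0.0, np.pi)  # exact integral is 2
assert abs(val - 2.0) < 1e-10
assert err < 1e-8
```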


- Mon, 21 Mar: Convex sets and functions (B&V 2.1–3, 3.1–2)
- Practice problems: B&V exercises 2.1–16, 3.1–12, 3.15–22, 3.28–32
- Further reading: B&V Appendix A.4, “Derivatives”

- Thu, 24 Mar: Optimization (B&V 4.1–2, 9.1–2)
- Practice problems: B&V exercises 9.1–3, 9.5

- Mon, 28 Mar: Descent methods (B&V 9.3–4)
- Practice problems: B&V 9.6, 9.8

- Thu, 31 Mar: Newton’s method (B&V 9.5)
- Practice problems: B&V exercises 9.4, 9.10, 9.11, 9.24, 9.25
- Further reading: H 6.5.5, “Secant Updating Methods”; Heath 6.6, “Nonlinear Least Squares”

- Homework assignments: 40% (dates for future assignments are tentative)
- Assignment 1: 13 Jan – 27 Jan
- Assignment 2: 27 Jan – ~~10 Feb~~ 15 Feb
- Assignment 3 (helper code): 15 Feb – ~~3 Mar~~ 5 Mar
- Assignment 4: 3 Mar – ~~17 Mar~~ 21 Mar
- Assignment 5: ~~17 Mar~~ 18 Mar – 31 Mar

- Minor: ~~25%~~ 20%
- Major: ~~35%~~ 40%

Students should use Python 3 with Numpy/Scipy for the programming component of the homework. If you do not have prior experience with these libraries, please go through one of the following tutorials to familiarize yourself:

- “Getting started with Python for science” from the *Scipy Lecture Notes*
- “Python Numpy Tutorial” from the *Stanford CS231n notes*

**Grading:** Following institute policy, a minimum of 80% is required for an A grade, and a minimum of 30% for a D grade.

**Late policy:** Homework assignments are due at midnight on the due date. You are allowed a total of 5 late days across all the assignments. A maximum of 2 late days can be used for each assignment. Any assignment submitted late after the total allowed late days have been used will not be graded.

**Audit policy:** A minimum of 40% marks is required for audit pass.

**Attendance policy:** Attendance lower than 50% may result in a one-grade penalty (e.g. A to A–, or A– to B).

**Collaboration policy:** Adapted from Dan Weld’s guidelines, via Mausam:

Collaboration is a very good thing. On the other hand, cheating is considered a very serious offense. Please don’t do it! Concern about cheating creates an unpleasant environment for everyone. If you cheat, you get a zero in the assignment, and additionally you risk losing your position as a student in the department and the institute. The department’s policy on cheating is to report any cases to the disciplinary committee. What follows afterwards is not fun.

So how do you draw the line between collaboration and cheating? Here’s a reasonable set of ground rules. Failure to understand and follow these rules will constitute cheating, and will be dealt with as per institute guidelines.

**The Kyunki Saas Bhi Kabhi Bahu Thi Rule:** This rule says that you are free to meet with fellow student(s) and discuss assignments with them. Writing on a board or shared piece of paper is acceptable during the meeting; however, you should not take any written (electronic or otherwise) record away from the meeting. This applies when the assignment is supposed to be an individual effort or whenever two teams discuss common problems they are each encountering (inter-group collaboration). After the meeting, engage in a half hour of mind-numbing activity (like watching an episode of *Kyunki Saas Bhi Kabhi Bahu Thi*) before starting to work on the assignment. This will ensure that you are able to reconstruct what you learned from the meeting, by yourself, using your own brain.

**The Right to Information Rule:** To ensure that all collaboration is on the level, you must always write the name(s) of your collaborators on your assignment. This also applies when two groups collaborate.