Assignment 3

Published

March 11, 2024

Abstract
Finite difference approximation of gradients.

Due: 20/03/2024

Problem Statement

We know that the Taylor series expansion of \(f\) about \(x_0\) is

\[ f(x_0 + \Delta x) = f(x_0) + \Delta x f'(x_0) + \frac{(\Delta x)^2}{2!} f''(x_0) + \cdots\] If we ignore the higher-order terms, then a linear approximation can be written as

\[ f(x_0 + \Delta x) \approx f(x_0) + \Delta x f'(x_0)\]

which gives,

\[ f'_{FD}(x_0) \approx \frac{f(x_0 + \Delta x) - f(x_0)}{\Delta x} \]

This is called the finite difference approximation of the derivative. It is used extensively in scientific computing to approximate derivatives of functions. Clearly, as \(\Delta x \rightarrow 0\), the right-hand side tends to the analytical derivative \(f'(x_0)\). We want to investigate the accuracy of this approximation as we reduce \(\Delta x\).
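As a quick sanity check, the forward difference formula above can be sketched in plain Python using the standard `math` module (the function name `fd_derivative` is just an illustrative choice):

```python
import math

def fd_derivative(f, x0, dx):
    """Forward finite difference approximation of f'(x0)."""
    return (f(x0 + dx) - f(x0)) / dx

# Example: f(x) = sin(x), whose exact derivative is cos(x)
x0 = math.pi / 4
approx = fd_derivative(math.sin, x0, 1e-6)
exact = math.cos(x0)
```

For this moderate step size the approximation agrees with \(\cos(\pi/4)\) to several significant digits; the assignment explores what happens as \(\Delta x\) shrinks further.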

Consider \(f'(x_0)\) for \(f(x) = \sin{x}\) at \(x_0 \in \{\pi/16, \pi/4, 15\pi/16\}\) and \(\Delta x = 10^{-\alpha},\ \alpha \in \{1, \dots, 13\}\). The error of the finite difference (FD) approximation can be defined as

\[E_{FD} = \left| 1 - \frac{f'_{FD}(x_0)}{f'(x_0)} \right|.\]

Plot \(E_{FD}\) vs. \(\alpha\) for all values of \(x_0\). Use a log scale for the y-axis.
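A minimal sketch of the error computation, assuming NumPy is available (the names `E_FD`, `design_points`, and `errors` are illustrative, not prescribed by the assignment):

```python
import numpy as np

def E_FD(f, fprime, x0, dx):
    """Relative error of the forward finite difference derivative."""
    fd = (f(x0 + dx) - f(x0)) / dx
    return abs(1 - fd / fprime(x0))

alphas = np.arange(1, 14)  # alpha = 1, ..., 13
design_points = [np.pi / 16, np.pi / 4, 15 * np.pi / 16]

# One error curve per design point, over all step sizes 10**(-alpha)
errors = {x0: [E_FD(np.sin, np.cos, x0, 10.0 ** (-a)) for a in alphas]
          for x0 in design_points}
```

The resulting curves can then be plotted, e.g. with `matplotlib.pyplot.semilogy(alphas, errors[x0])` for each design point.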

Questions

  1. Does the error decrease monotonically as \(\alpha\) increases?

  2. Does the \(\Delta x\) at which we obtain the minimum error remain constant for a given function, or does it depend on the design point (\(x_0\)) at which the gradient is calculated?

  3. If we set \(f(x) = x^2 \sin{(x)} \cos{(x)}\), can we expect the minimum \(E_{FD}\) at the same \(\Delta x\) for a given \(x_0\)?

Deliverables

  • A PDF report with source code and a discussion of the three questions.
Tip

Understand floating-point arithmetic and look up subtractive cancellation error.
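For a concrete illustration of subtractive cancellation, consider subtracting two nearly equal doubles, which is exactly what the numerator \(f(x_0 + \Delta x) - f(x_0)\) does for small \(\Delta x\):

```python
# Subtracting nearly equal floating-point numbers loses significant digits.
# Mathematically, (1 + delta) - 1 equals delta exactly; in double precision
# only the leading bits of delta survive the addition to 1.
delta = 1e-12
lost = (1.0 + delta) - 1.0

# Relative error of the recovered value against the true delta
rel_err = abs(lost - delta) / delta
```

Here `rel_err` is many orders of magnitude larger than machine epsilon, even though each individual operation is correctly rounded; this is the mechanism behind the error growth at large \(\alpha\).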