Akshay Shankar
March 19, 2021
Given a function \(f(x)\), how do we find \(f'(x)\) at \(x = x_0\)?
The most straightforward way is:
\[\frac{df(x)}{dx} \equiv \frac{f(x+\Delta x) - f(x)}{\Delta x}\]
in the limit \(\Delta x \to 0\).
More formally we can use the Taylor expansion:
\[f(x + \Delta x) = f(x) + \Delta x \cdot f'(x) + \mathcal{O}(\Delta x ^2)\]
\[\implies f'(x) \approx \frac{f(x+\Delta x) - f(x)}{\Delta x}\]
This is the “forward difference” formula, whose error drops as \(\mathcal{O}(\Delta x)\).
Other forms are also possible:
Backward difference: \[f'(x) \approx \frac{f(x) - f(x - \Delta x)}{\Delta x}\]
Central difference (error \(\mathcal{O}(\Delta x^2)\)): \[f'(x) \approx \frac{f(x+\Delta x / 2) - f(x - \Delta x / 2)}{\Delta x}\]
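As a quick check, here is a minimal Python sketch (the function names and test point are my own choices) comparing the three formulas on \(\sin\), whose exact derivative is \(\cos\):

```python
import numpy as np

def forward_diff(f, x, dx):
    """Forward difference: error is O(dx)."""
    return (f(x + dx) - f(x)) / dx

def backward_diff(f, x, dx):
    """Backward difference: error is O(dx)."""
    return (f(x) - f(x - dx)) / dx

def central_diff(f, x, dx):
    """Central difference: error is O(dx^2)."""
    return (f(x + dx / 2) - f(x - dx / 2)) / dx

# Exact derivative of sin is cos; compare the three formulas.
x0, dx = 1.0, 1e-4
for rule in (forward_diff, backward_diff, central_diff):
    print(rule.__name__, abs(rule(np.sin, x0, dx) - np.cos(x0)))
```

The central difference error is visibly several orders of magnitude smaller for the same \(\Delta x\).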
Second derivatives can be calculated in a similar way, for instance by applying the central difference formula to \(f'\) itself:
\[f''(x) \approx \frac{f'(x + \Delta x / 2) - f'(x - \Delta x / 2)}{\Delta x}\] \[ f''(x) \approx \frac{(f(x + \Delta x) - f(x)) - (f(x) - f(x - \Delta x))}{\Delta x ^2} \]
\[f''(x) \approx \frac{f(x + \Delta x) - 2\cdot f(x) + f(x - \Delta x)}{\Delta x ^2}\]
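The same kind of sanity check for the centered second-derivative formula (again a sketch, with names of my own choosing):

```python
import numpy as np

def second_diff(f, x, dx):
    """Centered second derivative: error is O(dx^2)."""
    return (f(x + dx) - 2 * f(x) + f(x - dx)) / dx**2

# Second derivative of sin is -sin.
x0, dx = 1.0, 1e-3
print(abs(second_diff(np.sin, x0, dx) + np.sin(x0)))  # ~1e-7
```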
To build higher-order formulas, let us write the following Taylor expansions:
\[f(x + 2\Delta x) = f(x) + (2\Delta x)f'(x) + (2\Delta x)^2 \frac{f''(x)}{2!} + (2\Delta x)^3 \frac{f'''(x)}{3!} + \mathcal{O}(\Delta x^4)\]
\[f(x + \Delta x) = f(x) + (\Delta x)f'(x) + (\Delta x)^2 \frac{f''(x)}{2!} + (\Delta x)^3 \frac{f'''(x)}{3!} + \mathcal{O}(\Delta x^4)\]
\[f(x - \Delta x) = f(x) - (\Delta x)f'(x) + (\Delta x)^2 \frac{f''(x)}{2!} - (\Delta x)^3 \frac{f'''(x)}{3!} + \mathcal{O}(\Delta x^4)\]
\[f(x - 2\Delta x) = f(x) - (2\Delta x)f'(x) + (2\Delta x)^2 \frac{f''(x)}{2!} - (2\Delta x)^3 \frac{f'''(x)}{3!} + \mathcal{O}(\Delta x^4)\]
Multiply each equation by a parameter (\(a\), \(b\), \(c\), \(d\) respectively) and add them.
The LHS is simply a linear combination of the function values:
\[af(x+2\Delta x) + bf(x+\Delta x) + cf(x - \Delta x) + df(x - 2\Delta x)\]
The RHS has four terms:
\[(a+b+c+d)\, f(x)\] \[(2a+b-c-2d)\, \Delta x \, f'(x)\] \[(2a + \frac{1}{2}b + \frac{1}{2}c + 2d)\, \Delta x^2 \, f''(x) \] \[(\frac{4}{3}a + \frac{1}{6}b - \frac{1}{6}c - \frac{4}{3}d)\, \Delta x^3 \, f'''(x)\]
\[\begin{equation} \begin{pmatrix} 1 & 1 & 1 & 1\\ 2 & 1 & -1 & -2\\ 2 & \frac{1}{2} & \frac{1}{2} & 2\\ \frac{4}{3} & \frac{1}{6} & -\frac{1}{6} & -\frac{4}{3} \end{pmatrix} \cdot \begin{pmatrix} a\\ b\\ c\\ d \end{pmatrix}=\begin{pmatrix} ?\\ ?\\ ?\\ ? \end{pmatrix} \end{equation}\]
\[H \cdot m = C\]
where \(m = (a, b, c, d)^T\). Set \(C\) to different vectors to pick out whichever term we want!
Interpolation!
\[\begin{equation}C = \begin{pmatrix}1& 0& 0& 0\end{pmatrix} \end{equation}\]
First derivative (up to the factor of \(\Delta x\))!
\[\begin{equation}C = \begin{pmatrix}0& 1& 0& 0\end{pmatrix} \end{equation}\]
and so on.
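As a sketch, this system can be solved numerically, e.g. with numpy. Note that with the powers of \(\Delta x\) written out above, \(C = (0, 1, 0, 0)^T\) makes the combination equal \(\Delta x \, f'(x)\), so we divide by \(\Delta x\) at the end:

```python
import numpy as np

# The matrix H built from the Taylor expansions above,
# for the stencil {x+2dx, x+dx, x-dx, x-2dx}.
H = np.array([
    [1,    1,    1,    1   ],  # multiplies f(x)
    [2,    1,   -1,   -2   ],  # multiplies dx   * f'(x)
    [2,    1/2,  1/2,  2   ],  # multiplies dx^2 * f''(x)
    [4/3,  1/6, -1/6, -4/3 ],  # multiplies dx^3 * f'''(x)
])

# Pick out the first derivative: C = (0, 1, 0, 0)
C = np.array([0.0, 1.0, 0.0, 0.0])
a, b, c, d = np.linalg.solve(H, C)
print(a, b, c, d)  # -1/12, 2/3, -2/3, 1/12

# f'(x) ~ (a*f(x+2dx) + b*f(x+dx) + c*f(x-dx) + d*f(x-2dx)) / dx:
# the classic fourth-order central difference stencil.
```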
Space and time are continuous and have infinite extent.
Computer memory on the other hand…
Not so much.
How do we simulate physical situations and equations governing them on a computer?
By discretizing spatial co-ordinates and time.
Suppose we have some physical process which changes a quantity \(y\) in the following way.
\[\frac{\partial y}{\partial t} = c \frac{\partial y}{\partial x}\]
Discretize the spatial co-ordinate \(x\) over \([x_0, x_N]\) into \(n\) equally spaced points:
\[x \equiv \{x_0, x_0 + \Delta x, ..., x_N\}\] \[x_i \equiv x_0 + i\Delta x \ \ \ \forall i \in [0, n-1]\]
Similarly for time.
This is where finite differences come into play.
\[x \equiv x_i\]
\[t \equiv t_j\]
\[y(x, t) \equiv y(x_i, t_j) \equiv y_i^j\]
\[y(x + a\Delta x, t + b\Delta t) \equiv y(x_{i+a}, t_{j+b}) \equiv y_{i+a}^{j+b}\]
The discretized equation is now
\[\frac{y_i^{j+1} - y_i^j}{\Delta t} = c \cdot \frac{y_{i+1}^j - y_i^j}{\Delta x}\]
\[ y_i^{j+1} = y_i^j + \frac{c\cdot \Delta t}{\Delta x} \cdot (y_{i+1}^j - y_i^j) \]
We now have an equation of the form:
\[y_i^{j+1} = Ay_i^j + By_{i+1}^j\]
with \(A = 1 - \frac{c\Delta t}{\Delta x}\) and \(B = \frac{c\Delta t}{\Delta x}\).
Apply this repeatedly to step forward in time.
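A minimal time-stepping sketch in Python; the Gaussian initial condition, grid sizes, and fixed right boundary are my own assumptions:

```python
import numpy as np

n, c = 200, 1.0
x = np.linspace(0.0, 10.0, n)
dx = x[1] - x[0]
dt = 0.4 * dx / c            # keep c*dt/dx <= 1 for stability
r = c * dt / dx

y = np.exp(-(x - 5.0)**2)    # initial condition: a Gaussian pulse
for step in range(500):
    # y_i^{j+1} = y_i^j + r * (y_{i+1}^j - y_i^j), vectorized over i
    y[:-1] += r * (y[1:] - y[:-1])
```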
Can we be smarter? Linear algebra subroutines are well optimized in most languages…
Let us think of this in a new way. Suppose we divide space into \(n\) discrete points; we then have \(n\) values of \(y\).
This means we can interpret \(y^j\) (at a time \(t_j\)) as a column matrix, or a ‘vector’: \[\begin{equation}
y^j \equiv \begin{pmatrix}
y_0^j\\
y_1^j\\
\vdots\\
y_{n-1}^j
\end{pmatrix}
\end{equation}\]
This equation \(y_i^{j+1} = Ay_i^j + By_{i+1}^j\) can be neatly cast into matrix form.
\[\begin{equation} \begin{pmatrix} y_0^{j+1}\\ y_1^{j+1}\\ \vdots\\ y_{n-1}^{j+1} \end{pmatrix} = \begin{pmatrix} A & B & & & \\ & A & B & & \\ & & \ddots & \ddots & \\ & & & A & B\\ & & & & A \end{pmatrix} \cdot \begin{pmatrix} y_0^j\\ y_1^j\\ \vdots\\ y_{n-1}^j \end{pmatrix} \end{equation}\]
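The same update written as one matrix-vector product per step, a sketch under the same assumed grid as before:

```python
import numpy as np

n, c = 200, 1.0
x = np.linspace(0.0, 10.0, n)
dx = x[1] - x[0]
dt = 0.4 * dx / c

A = 1.0 - c * dt / dx        # coefficient of y_i^j
B = c * dt / dx              # coefficient of y_{i+1}^j

# Bidiagonal update matrix: A on the diagonal, B on the superdiagonal.
M = A * np.eye(n) + B * np.eye(n, k=1)

y = np.exp(-(x - 5.0)**2)
for step in range(500):
    y = M @ y                # one time step = one matrix-vector product
```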
For example, the second order derivative we found earlier: \[f''(x) \approx \frac{f(x + \Delta x) - 2\cdot f(x) + f(x - \Delta x)}{\Delta x ^2}\]
This second derivative operation can itself be represented as a matrix operator!
\[\begin{equation} D = \frac{1}{(\Delta x)^2} \begin{pmatrix} -2 & 1 & & & \\ 1 & -2 & 1 & & \\ & \ddots & \ddots & \ddots & \\ & & 1 & -2 & 1\\ & & & 1 & -2 \end{pmatrix} \end{equation}\]
\[D\vec{f} \approx \vec{f''}\]
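A sketch of \(D\) in numpy, checked on \(f(x) = x^2\), for which the centered formula is exact at interior points:

```python
import numpy as np

n, dx = 100, 0.1
# Tridiagonal second-derivative operator; the first and last rows
# would need boundary conditions in a real problem.
D = (np.eye(n, k=1) - 2 * np.eye(n) + np.eye(n, k=-1)) / dx**2

x = np.arange(n) * dx
f = x**2                     # f''(x) = 2 everywhere
print((D @ f)[1:-1])         # ~2.0 at all interior points
```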