Now we turn to one of the most common and important applications of interpolants: finding derivatives of functions. Because differentiation is a linear operation, we will constrain ourselves to formulas that are linear in the nodal values.
Note that while (5.4.1) is about finding the derivative at a single point x, the same formula can be applied for different x. The usual situation is a regularly spaced grid of nodes, a,a+h,a+2h,…,b, and then the value of f at each node takes part in multiple applications of the formula. This will be demonstrated in Example 5.4.1 below.
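As a brief illustration (in Python; the test function, interval, and grid size are arbitrary choices, not from the text), the forward-difference estimate f′(x) ≈ [f(x+h) − f(x)]/h can be evaluated at every node of such a grid in one vectorized statement. Each interior value of f is reused: it appears in the estimate at its own node and again in the estimate at the node to its left.

```python
import numpy as np

f = np.sin                      # test function (its exact derivative is cos)
a, b, n = 0.0, 1.0, 5           # interval and number of subintervals (illustrative)
h = (b - a) / n
t = a + h * np.arange(n + 1)    # regularly spaced nodes a, a+h, ..., b
y = f(t)                        # one function value per node

# Forward-difference estimate of f' at every node except the last one.
# Each interior value y[i] takes part in two applications of the formula:
# the estimate at t[i] and the estimate at t[i-1].
fd = (y[1:] - y[:-1]) / h

print(np.max(np.abs(fd - np.cos(t[:-1]))))   # error at the nodes is O(h)
```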
As pointed out in Example 5.4.1, the only real distinction between (5.4.2) and (5.4.3) is whether we think that f′ is being evaluated at the left node or the right one. Symmetry would suggest that we should evaluate it halfway between. That is the motivation behind centered difference formulas.
Let’s derive the shortest centered formula using p=q=1. For simplicity, we will set x=0 without affecting the result. This means that f(−h), f(0), and f(h) are all available in (5.4.1).
Note that (5.4.2) is simply the slope of the line through the points (0,f(0)) and (h,f(h)). One route to using all three function values is to differentiate the quadratic polynomial that interpolates (−h,f(−h)) as well (see Exercise 1):

$$Q(x) = \frac{x(x-h)}{2h^2}\,f(-h) - \frac{x^2-h^2}{h^2}\,f(0) + \frac{x(x+h)}{2h^2}\,f(h), \qquad (5.4.7)$$

$$f'(0) \approx Q'(0) = \frac{f(h) - f(-h)}{2h}. \qquad (5.4.8)$$
This result is equivalent to (5.4.1) with p=q=1 and weights a₋₁ = −1/2, a₀ = 0, and a₁ = 1/2. Observe that while the value of f(0) was available during the derivation, its weight ends up being zero.
Besides the aesthetic appeal of symmetry, in Convergence of finite differences we will see another important advantage of (5.4.8) compared to the one-sided formulas.
We can in principle derive any finite-difference formula from the same process: Interpolate the given function values, then differentiate the interpolant exactly. Some results of the process are given in Table 5.4.1 for centered differences, and in Table 5.4.2 for forward differences. Both show the weights for estimating the derivative at x=0. To get backward differences, you change the signs and reverse the order of the coefficients in any row of Table 5.4.2; see Exercise 2.
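This recipe can also be carried out numerically. The following Python sketch (the helper name `weights_by_interpolation` is made up for illustration) computes each node's weight as the derivative at zero of its cardinal interpolating polynomial, i.e., the polynomial that is 1 at that node and 0 at the others. With nodes taken at integer multiples of h = 1, it reproduces rows of the tables below; with nodes scaled by h, the returned weights already include the division by h that appears in (5.4.1).

```python
import numpy as np

def weights_by_interpolation(nodes, m=1):
    """Weight of node j = m-th derivative at 0 of the polynomial that
    interpolates the value 1 at node j and 0 at every other node."""
    nodes = np.asarray(nodes, dtype=float)
    n = len(nodes)
    w = np.zeros(n)
    for j in range(n):
        cardinal = np.zeros(n)
        cardinal[j] = 1.0
        coeffs = np.polyfit(nodes, cardinal, n - 1)     # interpolate...
        w[j] = np.polyval(np.polyder(coeffs, m), 0.0)   # ...then differentiate at 0
    return w

print(weights_by_interpolation([-1, 0, 1]))          # ≈ [-1/2, 0, 1/2]
print(weights_by_interpolation([-2, -1, 0, 1, 2]))   # ≈ [1/12, -2/3, 0, 2/3, -1/12]
```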
Table 5.4.1: Weights for centered finite-difference formulas.

| order | −4h | −3h | −2h | −h | 0 | h | 2h | 3h | 4h |
|-------|-------|--------|------|------|---|-----|-------|-------|--------|
| 2 | | | | −1/2 | 0 | 1/2 | | | |
| 4 | | | 1/12 | −2/3 | 0 | 2/3 | −1/12 | | |
| 6 | | −1/60 | 3/20 | −3/4 | 0 | 3/4 | −3/20 | 1/60 | |
| 8 | 1/280 | −4/105 | 1/5 | −4/5 | 0 | 4/5 | −1/5 | 4/105 | −1/280 |
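For example, interpreting each row as the weights a_k of (5.4.1), so that the weighted sum of function values is divided by h, the order-4 row spells out as

$$f'(0) \approx \frac{1}{h}\left[ \tfrac{1}{12}\,f(-2h) - \tfrac{2}{3}\,f(-h) + \tfrac{2}{3}\,f(h) - \tfrac{1}{12}\,f(2h) \right].$$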
Table 5.4.2: Weights for forward finite-difference formulas. To get backward differences, change the signs and reverse the order of the coefficients.

| order | 0 | h | 2h | 3h | 4h |
|-------|--------|---|------|-----|------|
| 1 | −1 | 1 | | | |
| 2 | −3/2 | 2 | −1/2 | | |
| 3 | −11/6 | 3 | −3/2 | 1/3 | |
| 4 | −25/12 | 4 | −3 | 4/3 | −1/4 |
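To illustrate the conversion rule, reversing the order of the order-2 row and changing the signs gives the second-order backward difference

$$f'(0) \approx \frac{1}{h}\left[ \tfrac{1}{2}\,f(-2h) - 2\,f(-h) + \tfrac{3}{2}\,f(0) \right].$$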
The main motivation for using more function values in a formula is to improve the accuracy. This is measured by order of accuracy, which is shown in the tables and explored in Section 5.5.
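As a quick numerical check (the test function and step sizes here are arbitrary choices), the order-2 and order-4 centered rows of Table 5.4.1 can be applied to f(x) = eˣ at x = 0; halving h cuts the first error by roughly a factor of 4 and the second by roughly 16, consistent with orders 2 and 4.

```python
import numpy as np

f, exact = np.exp, 1.0            # test function and its exact derivative at x = 0
x = 0.0

for h in [0.1, 0.05, 0.025]:
    # order 2: weights -1/2, 0, 1/2 at nodes -h, 0, h (divided by h)
    d2 = (f(x + h) - f(x - h)) / (2 * h)
    # order 4: weights 1/12, -2/3, 0, 2/3, -1/12 at nodes -2h, ..., 2h (divided by h)
    d4 = (f(x - 2*h) - 8*f(x - h) + 8*f(x + h) - f(x + 2*h)) / (12 * h)
    print(f"h = {h:6.3f}   order-2 error = {abs(d2 - exact):.2e}   "
          f"order-4 error = {abs(d4 - exact):.2e}")
```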
Many applications require the second derivative of a function. It’s tempting to use the finite difference of a finite difference. For example, applying (5.4.8) to f′ gives

$$f''(0) \approx \frac{f'(h) - f'(-h)}{2h} \approx \frac{f(2h) - 2f(0) + f(-2h)}{4h^2}.$$

This is a valid formula, but it uses values at ±2h rather than the closer values at ±h. A better and more generalizable tactic is to return to the quadratic Q(x) in (5.4.7) and use Q′′(0) to approximate f′′(0). Doing so yields

$$f''(0) \approx Q''(0) = \frac{f(-h) - 2f(0) + f(h)}{h^2},$$

which is the simplest centered second-difference formula. As with the first derivative, we can choose larger values of p and q in (5.4.1) to get new formulas, such as

$$f''(0) \approx \frac{-f(-2h) + 16f(-h) - 30f(0) + 16f(h) - f(2h)}{12h^2},$$

$$f''(0) \approx \frac{2f(0) - 5f(h) + 4f(2h) - f(3h)}{h^2}.$$
For the second derivative, converting a forward difference to a backward difference requires reversing the order of the weights, while not changing their signs.
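A similar check (again with an arbitrary test function and step size) compares the three-point centered second difference above with the difference-of-differences formula that uses values at ±2h; for the same h, the wider formula's error is about four times larger.

```python
import numpy as np

f = np.sin
d2f_exact = -np.sin(1.0)          # exact value of f'' at the chosen point
x, h = 1.0, 0.05                  # illustrative point and step size

# three-point centered second difference, using values at x-h, x, x+h
compact = (f(x - h) - 2*f(x) + f(x + h)) / h**2
# difference-of-differences formula, using values at x-2h, x, x+2h
wide = (f(x - 2*h) - 2*f(x) + f(x + 2*h)) / (4 * h**2)

print(abs(compact - d2f_exact), abs(wide - d2f_exact))
```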
Although function values at equally spaced nodes are a common and convenient situation, the node locations may be arbitrary. The general form of a finite-difference formula is

$$f^{(m)}(0) \approx \sum_{k=0}^{r} c_k\, f(t_k).$$
We no longer assume equally spaced nodes, so there is no “h” to be used in the formula. As before, the weights may be applied after any translation of the independent variable. The weights again follow from the interpolate/differentiate recipe, but the algebra becomes complicated. Fortunately there is an elegant recursion known as Fornberg’s algorithm that can calculate these weights for any desired formula. We present it without derivation as Function 5.4.1.
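Since Function 5.4.1 itself is not reproduced here, the following is a minimal Python sketch of Fornberg's recursion (the name `fdweights` and its interface are assumptions, not necessarily the book's). It returns weights w such that the sum of w[k]·f(t[k]) over the nodes approximates the m-th derivative of f at x = 0.

```python
import numpy as np

def fdweights(t, m):
    """Weights for approximating the m-th derivative at x = 0 from the
    function values at the arbitrary nodes in t (Fornberg's recursion)."""

    def weight(t, m, r, k):
        # Weight of node k when only the nodes t[0], ..., t[r] are used.
        if m < 0 or m > r:               # undefined coefficients are zero
            return 0.0
        if m == 0 and r == 0:            # base case: one-point interpolation
            return 1.0
        if k < r:
            return (t[r] * weight(t, m, r - 1, k)
                    - m * weight(t, m - 1, r - 1, k)) / (t[r] - t[k])
        # k == r: weight of the node added last
        beta = np.prod([t[r - 1] - t[j] for j in range(r - 1)]) \
             / np.prod([t[r] - t[j] for j in range(r)])
        return beta * (m * weight(t, m - 1, r - 1, r - 1)
                       - t[r - 1] * weight(t, m, r - 1, r - 1))

    t = [float(v) for v in t]
    r = len(t) - 1
    return np.array([weight(t, m, r, k) for k in range(r + 1)])

print(fdweights([-1, 0, 1], 1))    # ≈ [-1/2, 0, 1/2], as in Table 5.4.1
print(fdweights([-1, 0, 1], 2))    # ≈ [1, -2, 1], the centered second difference
```

Because the nodes are passed explicitly, scaling them by h produces weights that already contain the 1/hᵐ factor of the equispaced formulas. (The recursion is written here for clarity rather than efficiency.)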
Exercises

Exercise 1. (a) Show that Q(x), the quadratic defined in (5.4.7), interpolates the three values of f at x=−h, x=0, and x=h.
(b) Show that Q′(0) gives the finite-difference formula defined by (5.4.8).
Exercise 2. (a) ✍ Table 5.4.2 lists forward difference formulas in which p=0 in (5.4.1). Show that the change of variable g(x)=f(−x) transforms these formulas into backward difference formulas with q=0, and write out the table analogous to Table 5.4.2 for backward differences.
(b) ⌨ Suppose you are given the nodes t₀=0.9, t₁=1, and t₂=1.1, and f(x)=sin(2x). Using formulas from Table 5.4.1 and Table 5.4.2, compute second-order accurate approximations to f′ at each of the three nodes.
Exercise 3. ⌨ Let f(x)=e⁻ˣ, x=0.5, and h=0.2. Using Function 5.4.1 to get the necessary weights on five nodes centered at x, find finite-difference approximations to the first, second, third, and fourth derivatives of f. Make a table showing the derivative values and the errors in each case.
Exercise 4. ⌨ In the manner of Demo 5.4.5, use Function 5.4.1 on centered node vectors of length 3, 5, 7, and 9 to produce a table analogous to Table 5.4.1 for the second derivative f′′(0). (You do not need to show the orders of accuracy, just the weights.)
Exercise 5. ⌨ For this problem, let f(x)=tan(2x).
(a) ⌨ Apply Function 5.4.1 to find a finite-difference approximation to f′′(0.3) using the five nodes tⱼ=0.3+jh for j=−2,…,2 and h=0.05. Compare to the exact value of f′′(0.3).
(b) ⌨ Repeat part (a) for f′′(0.75) on the nodes tⱼ=0.75+jh. Why is the finite-difference result so inaccurate? (Hint: A plot of f might be informative.)
Exercise 6. ✍ Find the finite-difference formula for f′′(0) that results from applying (5.4.2) to f′ and then using (5.4.3) to approximate the values of f′ that appear in that result.
Exercise 7. (a) ✍ Show using L’Hôpital’s Rule that the centered formula approximation (5.4.8) converges to an equality as h→0.
(b) ✍ Derive two conditions on the finite-difference weights in (5.4.1) that arise from requiring convergence as h→0. (Hint: Consider what is required in order to apply L’Hôpital’s Rule, as well as the result of applying it.)