In numerical analysis, numerical differentiation describes algorithms for estimating the derivative of a mathematical function using values of the function and perhaps other knowledge about the function.
There are two main ways to numerically calculate the derivative of a function; both are also useful when computing a Jacobian matrix.
Finite difference formulas (Newton’s difference quotient)
This approach is the same one we learn in a high-school course: compute the slope of a nearby secant line using a small increment $h$:$$\dfrac {f(x+h)-f(x)}{h}$$
In an implementation, we need to make $h$ small, but not so small that floating-point rounding error in the subtraction $f(x+h)-f(x)$ dominates the result.
When $h > 0$, the formula above is also called Newton’s forward approximation, since it uses a point forward of the current one. When $h < 0$, it is called Newton’s backward approximation.
If, instead of the current point, we choose one small step both forward and backward to calculate the derivative, $\dfrac {f(x+h)-f(x-h)}{2h}$, it is called the central approximation. Since the error term here is proportional to $h^2$, for small values of $h$ this approximation is more accurate than the one-sided (forward/backward) estimates, as the short sketch below shows.
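A minimal Python sketch of both formulas (the test function, evaluation point, and step sizes are my own choices for illustration):

```python
import math

def forward_diff(f, x, h=1e-8):
    # Newton's forward approximation: slope of the secant through x and x+h.
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h=1e-6):
    # Central approximation: secant through x-h and x+h; error is O(h^2).
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx sin(x) at x = 1 should be cos(1) = 0.5403023058681398...
x = 1.0
print(forward_diff(math.sin, x))  # correct to roughly 8 digits
print(central_diff(math.sin, x))  # correct to roughly 10-11 digits
print(math.cos(x))                # exact value for comparison
```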
Complex-step approximation
The classical finite difference approximations for numerical differentiation are ill-conditioned. However, if $f$ is a holomorphic function, real-valued on the real line, which can be evaluated at points in the complex plane near $x$, then there are stable methods. The first derivative can be calculated by the complex-step derivative formula as follows: $$f'(x) \approx \frac{\operatorname{Im}(f(x+ih))}{h}$$
where $i$ is the imaginary unit and $h$ is a small step size.
The derivation is as follows:
- Let $f(x)$ be a function, let $x_0$ be a point on the real axis, and let $h$ be a real parameter.
- Expand $f(x)$ in a Taylor series off the real axis: $$f(x_0+ih) = f(x_0) + ihf'(x_0) - \frac{h^2 f''(x_0)}{2!} - \frac{ih^3 f^{(3)}(x_0)}{3!} + \cdots$$
- Take the imaginary part of both sides and divide by $h$: $$f'(x_0) = \frac{\operatorname{Im}(f(x_0+ih))}{h} + O(h^2)$$
- Therefore, the above gives an approximation to the value of the derivative $f'(x_0)$ that is accurate to order $O(h^2)$.
$h = 10^{-20}$ usually works very well in nearly all cases: since the formula involves no subtraction of nearby values, there is no cancellation error, so the resulting derivatives are essentially as accurate as the evaluation of $f$ itself.
Note that the above formula is only valid for calculating a first-order derivative.
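A minimal Python sketch of the complex-step formula (it assumes $f$ accepts complex arguments, which is why `cmath.sin` is used instead of `math.sin`):

```python
import cmath
import math

def complex_step_diff(f, x, h=1e-20):
    # f'(x) ~= Im(f(x + ih)) / h. No subtraction of nearby values
    # occurs, so h can be made tiny without cancellation error.
    return f(complex(x, h)).imag / h

# d/dx sin(x) at x = 1 should be cos(1).
print(complex_step_diff(cmath.sin, 1.0))  # accurate to machine precision
print(math.cos(1.0))                      # exact value for comparison
```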
Differential quadrature
Differential quadrature is the approximation of derivatives by using weighted sums of function values. The name is in analogy with quadrature, meaning numerical integration, where weighted sums are used in methods such as Simpson’s rule or the trapezium rule. There are various methods for determining the weight coefficients. Differential quadrature is used to solve partial differential equations; a small sketch follows.
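As an illustrative sketch, here is a simple differential quadrature on a uniform grid, with weights taken from second-order finite difference stencils (full differential quadrature methods typically derive the weights from polynomial interpolation over the whole grid; the grid and test function here are my own choices):

```python
import numpy as np

def diff_matrix(x):
    # Build a weight matrix D such that (D @ f_vals)[i] ~= f'(x[i]),
    # i.e. each derivative is a weighted sum of function values.
    n = len(x)
    h = x[1] - x[0]                      # uniform grid spacing
    D = np.zeros((n, n))
    for i in range(1, n - 1):            # interior: central-difference weights
        D[i, i - 1] = -1 / (2 * h)
        D[i, i + 1] = 1 / (2 * h)
    D[0, :3] = [-3 / (2 * h), 2 / h, -1 / (2 * h)]    # one-sided at left end
    D[-1, -3:] = [1 / (2 * h), -2 / h, 3 / (2 * h)]   # one-sided at right end
    return D

x = np.linspace(0.0, np.pi, 21)
D = diff_matrix(x)
error = np.max(np.abs(D @ np.sin(x) - np.cos(x)))
print(error)  # small, on the order of h^2
```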
References:
https://wiki2.org/en/Numerical_differentiation
https://blogs.mathworks.com/cleve/2013/10/14/complex-step-differentiation/
http://mdolab.engin.umich.edu/content/guide-complex-step-derivative-approximation-0