Newton-Cotes Formulas: The Master Recipe for Numerical Integration
Discover the single, elegant theory that generates the Trapezoidal Rule, Simpson's Rule, and more, and understand the deep connection between interpolation and integration.
The Quest for a Master Recipe
In our last guide, we explored several ways to find the area under a curve. We started with simple rectangles, graduated to the much better Trapezoidal Rule, and finally arrived at the incredibly powerful Simpson's Rule. They seemed like a collection of separate, clever tricks. But what if I told you they are not distinct methods at all? What if they are just the first few members of a single, unified family, all born from one "master recipe"?
That master recipe is the Newton-Cotes Quadrature Formula. It provides a universal framework for generating an entire family of integration rules. The core philosophy is brilliantly simple:
To approximate the integral of a complex function $f(x)$, we first replace $f(x)$ with a simpler function that is easy to integrate: a polynomial (an expression built from variables raised to non-negative whole-number powers, like $3x^2 + 5x - 7$, which is very easy for computers to evaluate).
By choosing polynomials of different degrees (the degree is the highest power of the variable; a quadratic in $x^2$ has degree 2), such as lines, parabolas, and cubics, we can generate a whole spectrum of integration formulas, each with a different level of accuracy and complexity. This masterclass will show you exactly how this recipe works.
The Derivation: From Interpolation to Integration
The foundation of the Newton-Cotes method is a topic we've already mastered: polynomial interpolation. The strategy is to take our complex function $f(x)$ and find an $n^{th}$-degree polynomial $P_n(x)$ that passes through $n+1$ points on its curve. Since $P_n(x)$ is a good approximation of $f(x)$, we can say:
$$ I = \int_a^b f(x)\,dx \approx \int_a^b P_n(x)\,dx $$

The magic happens when we remember the formula for our interpolating polynomial, for example, the Lagrange form (a method for directly constructing the unique polynomial that passes through a given set of data points, built from simpler "basis" polynomials):
$$ P_n(x) = \sum_{i=0}^{n} y_i L_i(x) $$

When we substitute this into our integral, something beautiful happens. Since $y_i$ (which is just $f(x_i)$) is a constant, we can pull it out of the integral:

$$ I \approx \int_a^b \sum_{i=0}^{n} y_i L_i(x)\,dx = \sum_{i=0}^{n} y_i \left( \int_a^b L_i(x)\,dx \right) $$
The term in the parentheses, $\int_a^b L_i(x) dx$, looks complicated, but it's just a definite integral of a simple polynomial. It evaluates to a single number. We call these numbers the weights, $w_i$. This means that *every* Newton-Cotes formula is just a simple weighted sum of the function's values at different points:
$$ I \approx \sum_{i=0}^{n} w_i y_i = w_0 y_0 + w_1 y_1 + \dots + w_n y_n $$

The only difference between the Trapezoidal Rule and Simpson's Rule lies in the values of these pre-calculated weights!
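This recipe is easy to check in code. The sketch below (plain Python; the function name `newton_cotes_weights` is our own) builds each Lagrange basis polynomial on unit-spaced nodes $x_j = 0, 1, \dots, n$ and integrates it exactly with rational arithmetic to recover the weights:

```python
from fractions import Fraction

def newton_cotes_weights(n):
    """Weights w_i = integral of L_i(x) over [0, n] for nodes x_j = 0..n (h = 1).

    Each Lagrange basis polynomial L_i is built as a list of coefficients,
    then integrated term by term with exact rational arithmetic."""
    nodes = list(range(n + 1))
    weights = []
    for i in nodes:
        # Build L_i(x) = prod over j != i of (x - x_j) / (x_i - x_j)
        coeffs = [Fraction(1)]                    # start with the constant 1
        for j in nodes:
            if j == i:
                continue
            denom = Fraction(i - j)
            new = [Fraction(0)] * (len(coeffs) + 1)
            for k, c in enumerate(coeffs):
                new[k + 1] += c / denom           # the x * c term
                new[k] += c * Fraction(-j) / denom
            coeffs = new
        # Integrate term by term: integral of c_k x^k over [0, n] = c_k n^(k+1)/(k+1)
        w = sum(c * Fraction(n) ** (k + 1) / (k + 1) for k, c in enumerate(coeffs))
        weights.append(w)
    return weights

print(newton_cotes_weights(1))  # [Fraction(1, 2), Fraction(1, 2)]
print(newton_cotes_weights(2))  # [Fraction(1, 3), Fraction(4, 3), Fraction(1, 3)]
```

With $h = 1$, the weights for $n = 1$ come out as $[\frac{1}{2}, \frac{1}{2}]$ and for $n = 2$ as $[\frac{1}{3}, \frac{4}{3}, \frac{1}{3}]$: exactly the Trapezoidal and Simpson's 1/3 coefficients derived next.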
Example Derivation: The Trapezoidal Rule (n=1)
Let's see the recipe in action. For $n=1$, we approximate our function with a first-degree polynomial: a straight line connecting two points, $(x_0, y_0)$ and $(x_1, y_1)$. This is the Trapezoidal Rule. We find the integral of the Lagrange polynomial that passes through them. After the math is done (integrating the two basis polynomials), we find the weights are $w_0 = w_1 = \frac{h}{2}$, where $h$ is the step size. This gives us the familiar formula:
$$ I \approx \frac{h}{2}y_0 + \frac{h}{2}y_1 = \frac{h}{2}(y_0 + y_1) $$

Example Derivation: Simpson's 1/3 Rule (n=2)
For $n=2$, we approximate our function with a second-degree polynomial: a parabola connecting three points, $(x_0, y_0), (x_1, y_1), (x_2, y_2)$. When we integrate the corresponding Lagrange polynomial, we find the weights are $w_0 = \frac{h}{3}$, $w_1 = \frac{4h}{3}$, and $w_2 = \frac{h}{3}$. This gives us the famous Simpson's 1/3 Rule formula:
$$ I \approx \frac{h}{3}y_0 + \frac{4h}{3}y_1 + \frac{h}{3}y_2 = \frac{h}{3}(y_0 + 4y_1 + y_2) $$

The Newton-Cotes Family of Formulas
By continuing this process for higher-degree polynomials, we can generate a whole family of increasingly accurate integration rules. These are known as the **closed Newton-Cotes formulas** because they use the endpoints of the integration interval.
| Degree (n) | Name | Formula | Error Order |
|---|---|---|---|
| 1 | Trapezoidal Rule | $\frac{h}{2}(y_0 + y_1)$ | $O(h^3 f'')$ |
| 2 | Simpson's 1/3 Rule | $\frac{h}{3}(y_0 + 4y_1 + y_2)$ | $O(h^5 f^{(4)})$ |
| 3 | Simpson's 3/8 Rule | $\frac{3h}{8}(y_0 + 3y_1 + 3y_2 + y_3)$ | $O(h^5 f^{(4)})$ |
| 4 | Boole's Rule | $\frac{2h}{45}(7y_0 + 32y_1 + 12y_2 + 32y_3 + 7y_4)$ | $O(h^7 f^{(6)})$ |
Notice that Simpson's 1/3 Rule (using a parabola) is more accurate than expected—its error order matches the 3/8 Rule (using a cubic)! This bonus comes from symmetry: the parabola's error on the cubic term cancels across the midpoint, so every even-$n$ rule gains one extra degree of exactness. This makes Simpson's 1/3 exceptionally efficient and is one of the reasons it's so popular.
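That bonus accuracy is easy to demonstrate: apply the single-panel rules to $f(x) = x^3$, a cubic that Simpson's parabola seemingly has no right to capture exactly. A quick check in plain Python (variable names are illustrative):

```python
# Integrate f(x) = x^3 over [0, 2]; the exact answer is 2^4 / 4 = 4.
f = lambda x: x ** 3
a, b = 0.0, 2.0

# Single-panel Simpson's 1/3: three points, one parabola.
h = (b - a) / 2
simpson = (h / 3) * (f(a) + 4 * f(a + h) + f(b))

# Single trapezoid over the same interval.
trapezoid = ((b - a) / 2) * (f(a) + f(b))

print(simpson)    # 4.0 -> exact for a cubic, despite fitting only a parabola
print(trapezoid)  # 8.0 -> a single trapezoid is far off
```

The parabola rule lands on the exact answer because its cubic-term error cancels by symmetry about the midpoint.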
A Critical Warning: The Danger of High-Order Rules
Looking at the table, it's tempting to think that using a higher and higher degree polynomial will always lead to better and better results. This is a trap! As we learned in our masterclass on interpolation errors, high-degree polynomials that are forced through evenly-spaced points can suffer from a catastrophic problem known as Runge's Phenomenon (wild oscillations near the edges of the interval, leading to a very poor approximation).
These high-order polynomials can develop wild oscillations between the data points, leading to massive errors in the integral calculation. The formulas become numerically unstable as the weights for higher `n` have large magnitudes and alternating signs. For this reason, it is almost always better to use a composite rule. Instead of using one 10th-degree polynomial to cover a large interval, it is far more accurate and stable to split the interval into 5 segments and apply a simple 2nd-degree rule (Simpson's 1/3) to each segment.
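A composite Simpson's 1/3 rule takes only a few lines. Here is a sketch in plain Python (the helper name `composite_simpson` is our own):

```python
import math

def composite_simpson(f, a, b, n):
    """Composite Simpson's 1/3 rule with n subintervals (n must be even).

    The interval [a, b] is split into n/2 panels, and the basic
    three-point parabola rule is applied to each panel; interior
    points alternate between weight 4 (panel midpoints) and 2
    (shared panel edges)."""
    if n % 2:
        raise ValueError("n must be even for Simpson's 1/3 rule")
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return h / 3 * total

# Five quadratic panels (10 subintervals) across [0, 3] instead of
# one 10th-degree polynomial fit.
approx = composite_simpson(math.exp, 0.0, 3.0, 10)
print(approx)  # close to the true value e^3 - 1, about 19.0855
```

Each panel only ever asks a parabola to track the function over a short span, so the method stays stable no matter how many panels are used.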
Solving Numerical Problems
Problem 1: Using Simpson's 3/8 Rule
Question: Use Simpson's 3/8 rule to estimate the integral of $f(x) = e^x$ from $x=0$ to $x=3$.
Step 1: Setup
Simpson's 3/8 Rule requires $n=3$ intervals (4 points). The interval is $[0, 3]$. The step size is $h = (3-0)/3 = 1$. Our x-values are $x_0=0, x_1=1, x_2=2, x_3=3$.
The corresponding y-values ($y=e^x$) are:
$y_0=e^0=1.0$
$y_1=e^{1}\approx2.7183$
$y_2=e^{2}\approx7.3891$
$y_3=e^{3}\approx20.0855$
Step 2: Apply the Formula
The formula is $I \approx \frac{3h}{8} [y_0 + 3y_1 + 3y_2 + y_3]$.
$$ I \approx \frac{3(1)}{8} [1.0 + 3(2.7183) + 3(7.3891) + 20.0855] $$
$$ I \approx 0.375 [1.0 + 8.1549 + 22.1673 + 20.0855] = 0.375 [51.4077] \approx 19.2779 $$
Conclusion:
The true value is $e^3 - e^0 = e^3 - 1 \approx 19.0855$. Our approximation of 19.2779 is within about 1% using just four points.
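The same computation takes a few lines of Python. (It uses full-precision $y$ values rather than the 4-decimal roundings above, so the last digit differs slightly.)

```python
import math

# Simpson's 3/8 rule with h = 1 on f(x) = e^x over [0, 3]
y = [math.exp(x) for x in range(4)]   # y0, y1, y2, y3 at x = 0, 1, 2, 3
h = 1.0
approx = (3 * h / 8) * (y[0] + 3 * y[1] + 3 * y[2] + y[3])
exact = math.exp(3) - 1

print(round(approx, 4))  # 19.2778 -> the hand value 19.2779 came from 4-decimal inputs
print(round(exact, 4))   # 19.0855
```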