

Optimization: A Masterclass on Golden Section Search & Newton's Method
[Figure: A winding path leading down into a valley, representing optimization]


Discover two powerful strategies for finding the lowest point in a valley—the slow and steady hiker versus the rocket-powered daredevil.

The Quest for the Best: An Introduction to Optimization

In our previous guides, we were detectives hunting for "zero"—the points where a function crosses the x-axis. This is called root finding. But often in science, business, and engineering, we face a different, arguably more important challenge: finding the function's absolute best value. This could be a maximum (like maximizing profit) or a minimum (like minimizing cost or error). This entire field is known as optimization.

Imagine you are an engineer trying to find the angle for a robotic arm that uses the least amount of energy to complete a task. Or a data scientist finding the perfect parameter that minimizes a machine learning model's prediction error. In both cases, you are hunting for a minimum value.

This masterclass explores two legendary algorithms (an algorithm is simply a set of step-by-step instructions for solving a problem) for one-dimensional, unconstrained optimization. They represent two fundamentally different philosophies for finding the bottom of a valley:

  • Golden Section Search: A safe, reliable bracketing method (bracketing methods keep the solution trapped within an ever-shrinking interval; slow, but guaranteed to work) that slowly but surely corners the minimum. It's the wise hiker who zig-zags down the valley, always keeping the lowest point in sight.
  • Newton's Method: A blazing-fast open method (open methods use a single guess, or two, to project a new, better guess; extremely fast, but they can fail spectacularly if the initial guess is poor) that uses calculus to take daring leaps towards the minimum. It's the daredevil in a wingsuit, diving straight for the bottom.

Understanding both is key to becoming a master of numerical methods, as each has its own strengths, weaknesses, and ideal use cases.


Golden Section Search: Nature's Optimal Strategy

The Golden Section Search is the elegant and more flexible cousin of the Fibonacci Search. It's a bracketing method that is guaranteed to find the minimum of a function, provided that the function is unimodal.

A unimodal function (from "uni", one, and "modus", mode) is one that, within our chosen interval $[a, b]$, has only one valley: it consistently decreases, hits the minimum, and then consistently increases. This property is vital because it allows us to discard parts of our search interval with absolute confidence, knowing the minimum cannot possibly be in the section we're throwing away.

The Power of the Golden Ratio

What makes this method so special is its use of the famous Golden Ratio, $\phi$ (phi), which is approximately 1.618. This irrational number has fascinated mathematicians for centuries and appears unexpectedly in nature, art, and architecture.

In our case, it provides the mathematically optimal constant ratio to shrink our search interval at every step. Instead of pre-calculating a list of Fibonacci numbers, we simply place two interior test points, $x_1$ and $x_2$, using a fixed proportion related to the Golden Ratio. The key is that after we discard a section, one of our old test points magically becomes a new test point in the new, smaller interval, meaning we only need one new function evaluation per iteration. This makes the method incredibly efficient.

Golden Section Search: A Worked Example

Let's find the minimum of $f(x) = (x-3)^2 + 1$ on the interval $[0, 10]$. At each iteration we track the bracket endpoints $a$ and $b$, the interior points $x_1$ and $x_2$, and the values $f(x_1)$ and $f(x_2)$; the bracketing interval shrinks by the same "golden" proportion every time.
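Here is how that shrinking loop looks in code. This is a minimal Python sketch (the function name `golden_section_search` and the tolerance are my own choices, not from the post); note how each iteration reuses one old interior point, so only one new function evaluation is needed:

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on the interval [a, b]."""
    inv_phi = (math.sqrt(5) - 1) / 2   # 1/phi = phi - 1 ≈ 0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                    # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1     # old x1 becomes the new x2
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)                 # the only new evaluation
        else:                          # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2     # old x2 becomes the new x1
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)                 # the only new evaluation
    return (a + b) / 2

x_min = golden_section_search(lambda x: (x - 3)**2 + 1, 0, 10)
```

For this function the true minimum is at $x = 3$, and the search converges there to within the requested tolerance.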

Newton's Method for Optimization: The Rocket-Powered Descent

If Golden Section Search is the reliable hiker, Newton's Method is the rocket-powered daredevil. It doesn't bother with bracketing; it uses the power of calculus to take huge, intelligent leaps directly toward the minimum.

A New Goal, A New Formula

We know from calculus that the minimum (or maximum) of a function occurs where its slope is zero. In other words, we are looking for the point $x$ where the first derivative is zero: $f'(x) = 0$.

This brilliantly transforms our optimization problem into a root-finding problem! We can now use the Newton's Method for root finding that we learned previously, but instead of applying it to our original function $f(x)$, we apply it to its derivative, $f'(x)$.

Recall the original Newton's formula for a function $g(x)$: $x_{n+1} = x_n - \frac{g(x_n)}{g'(x_n)}$. If we set our target function to be $g(x) = f'(x)$, then its derivative is $g'(x) = f''(x)$ (the second derivative). Substituting these in gives us the powerful Newton's Method formula for optimization:

$$x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}$$

The intuition is beautiful: the first derivative ($f'$) tells us the slope (are we going downhill?), and the second derivative ($f''$) tells us the curvature (how sharply the valley bends: is it wide and flat, or narrow and steep?). By using both pieces of information, the algorithm creates a parabolic model of the valley at its current guess and then jumps directly to the bottom of that parabola.
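The update translates into just a few lines of code. Here is a minimal Python sketch (the function name `newton_minimize`, the tolerance, and the iteration cap are illustrative choices, not from the post). For a quadratic like $f(x) = (x-3)^2 + 1$, the parabolic model is exact, so a single step lands right on the minimum:

```python
def newton_minimize(fprime, fsecond, x0, tol=1e-8, max_iter=50):
    """Find a stationary point of f by applying Newton's root-finding
    method to f'(x): x_{n+1} = x_n - f'(x_n) / f''(x_n)."""
    x = x0
    for _ in range(max_iter):
        fpp = fsecond(x)
        if fpp == 0:
            raise ZeroDivisionError("f''(x) = 0 (inflection point)")
        step = fprime(x) / fpp
        x -= step
        if abs(step) < tol:
            break
    return x

# f(x) = (x - 3)^2 + 1, so f'(x) = 2(x - 3) and f''(x) = 2.
x_min = newton_minimize(lambda x: 2 * (x - 3), lambda x: 2.0, x0=10.0)
```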

Convergence and Catastrophic Failure

When Newton's method works, it exhibits blazing-fast quadratic convergence—the number of correct digits roughly doubles at every step. However, it's a high-risk, high-reward strategy. It can fail spectacularly if:

  • The second derivative $f''(x)$ is zero (as at an inflection point, where the curvature changes sign from concave up to concave down, or vice versa), causing division by zero.
  • The initial guess is poor, causing the algorithm to diverge and shoot off to infinity.
  • It accidentally converges to a maximum instead of a minimum if it starts in a region where the function is concave down.


Solving Numerical Problems: A Step-by-Step Guide

Problem 1: Manual Golden Section Search

Question: Perform the first two iterations of the Golden Section Search to find the minimum of $f(x) = x^2 - 4x + 5$ on the interval $[0, 5]$.

Step 1: Setup
The golden ratio constant is $\phi \approx 1.618$. The reduction factor is $R = \phi - 1 \approx 0.618$. The initial interval is $[a, b] = [0, 5]$, with length $L = 5$.

Iteration 1:
Calculate the distance $d = R \times L = 0.618 \times 5 \approx 3.09$.
Place the interior points:

  • $x_1 = b - d = 5 - 3.09 = 1.91$
  • $x_2 = a + d = 0 + 3.09 = 3.09$
Evaluate the function at these points:
  • $f(x_1) = f(1.91) = (1.91)^2 - 4(1.91) + 5 \approx 1.008$
  • $f(x_2) = f(3.09) = (3.09)^2 - 4(3.09) + 5 \approx 2.188$
Since $f(x_1) < f(x_2)$, the minimum must be in the left sub-interval. Our new interval becomes $[0, x_2] = [0, 3.09]$.

Iteration 2:
Our new interval is $[0, 3.09]$ with length $L \approx 3.09$.
Our old $x_1$ at 1.91 is now the new $x_2$ for this interval. We only need to calculate one new point.
The new distance is $d = R \times L \approx 0.618 \times 3.09 \approx 1.91$.
Our new $x_1$ is $b - d = 3.09 - 1.91 = 1.18$. So, our two interior points are now $x_1 = 1.18$ and $x_2 = 1.91$.
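Both hand iterations can be checked with a few lines of Python, using the same rounded factor $R = 0.618$ as the worked example:

```python
R = 0.618                          # reduction factor, phi - 1
f = lambda x: x**2 - 4*x + 5

# Iteration 1 on [a, b] = [0, 5]
a, b = 0.0, 5.0
d = R * (b - a)                    # 3.09
x1, x2 = b - d, a + d              # 1.91 and 3.09
assert f(x1) < f(x2)               # keep the left sub-interval [a, x2]
b = x2

# Iteration 2 on [0, 3.09]: the old x1 is reused as the new x2
x2 = x1
d = R * (b - a)
x1 = b - d                         # ≈ 1.18
```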

Problem 2: Manual Newton's Method for Optimization

Question: Perform the first two iterations of Newton's Method to find the minimum of $f(x) = e^x - 4x$ starting at $x_0 = 1$.

Step 1: Find the Derivatives
The first derivative is $f'(x) = e^x - 4$.
The second derivative is $f''(x) = e^x$.

Step 2: Iteration 1
The formula is $x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}$. Our start is $x_0=1$.

  • $f'(1) = e^1 - 4 \approx 2.718 - 4 = -1.282$
  • $f''(1) = e^1 \approx 2.718$
$$ x_1 = 1 - \frac{-1.282}{2.718} \approx 1 - (-0.471) = 1.471 $$

Step 3: Iteration 2
Our new guess is $x_1 = 1.471$.

  • $f'(1.471) = e^{1.471} - 4 \approx 4.354 - 4 = 0.354$
  • $f''(1.471) = e^{1.471} \approx 4.354$
$$ x_2 = 1.471 - \frac{0.354}{4.354} \approx 1.471 - 0.0813 = 1.3897 $$

Result: After just two iterations, we have rapidly moved from 1 to ~1.39. The true minimum is at $x = \ln(4) \approx 1.386$, so we are already extremely close.
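The two steps are easy to verify in Python. This sketch uses full floating-point precision rather than the rounded intermediates above, which is why the last digit differs slightly:

```python
import math

# f(x) = e^x - 4x, so f'(x) = e^x - 4 and f''(x) = e^x
fprime = lambda x: math.exp(x) - 4
fsecond = lambda x: math.exp(x)

x = 1.0
for _ in range(2):                 # two Newton steps from x0 = 1
    x -= fprime(x) / fsecond(x)

true_min = math.log(4)             # ln(4) ≈ 1.3863
```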


