Lesson 1
Newton's Method for Optimization
Lesson Introduction

Welcome to our lesson on Newton's Method for Optimization! This method helps us find the lowest point in a valley (minimum) or the highest peak on a mountain (maximum). By the end of this lesson, you'll understand Newton's Method, how it works, and how to use it in Python.

Imagine you're on a hike, looking for the lowest point in a valley. Newton's Method will guide you step-by-step to this point.

Task Setup: Function to Minimize

We're starting with a function f(x) that we want to minimize. Let's use:

f(x) = x^4 - 3x^3 + 2

Here's a plot of this function:

This plot shows the function's landscape with multiple local minima and maxima.

General Approach with Initial Guess

To minimize this function using Newton's Method, we start with an initial guess. Let's choose x_0 = 3. This choice is arbitrary.

The red point shows our starting point. We will update this guess step by step, moving closer to the minimum.

Updating the Guess Using Newton's Method

Newton's Method updates our guess using the first and second derivatives. The update formula is:

x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}

For our function f(x) = x^4 - 3x^3 + 2:

  • f'(x) = 4x^3 - 9x^2
  • f''(x) = 12x^2 - 18x

Important note: Newton's Method is designed to find a critical point, which could be a minimum, a maximum, or a saddle point.
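To make the update formula concrete, here is a minimal sketch of a single Newton step computed by hand from our starting guess x_0 = 3:

```python
# One Newton step for f(x) = x^4 - 3x^3 + 2, starting at x0 = 3
def f_prime(x):
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    return 12 * x**2 - 18 * x

x0 = 3
x1 = x0 - f_prime(x0) / f_double_prime(x0)
print(x1)  # f'(3) = 27, f''(3) = 54, so x1 = 3 - 27/54 = 2.5
```

Each subsequent step repeats this same computation from the new point.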

Python Implementation and Optimization Path: Part 1

Let's implement Newton's Method in Python and see it in action.

Python

def f_prime(x):
    return 4*x**3 - 9*x**2

def f_double_prime(x):
    return 12*x**2 - 18*x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        f_prime_value = f_prime(x)
        f_double_prime_value = f_double_prime(x)

        if abs(f_prime_value) < tolerance:
            break  # Convergence criterion

        x = x - f_prime_value / f_double_prime_value
        steps.append(x)

    return x, steps

Here, we take repeated steps according to the update formula above. We stop once the first derivative is very close to zero, indicating that a critical point has been reached. The code also defines a maximum number of iterations; this is needed because if the function has no minimum, or the method fails to find one, the process would otherwise loop forever.

Our function keeps track of all the steps so that we can plot the optimization path later.

Python Implementation and Optimization Path: Part 2

Here is the full code that uses Newton's Method to minimize f(x):

Python

def f_prime(x):
    return 4 * x ** 3 - 9 * x ** 2

def f_double_prime(x):
    return 12 * x ** 2 - 18 * x


def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        f_prime_value = f_prime(x)
        f_double_prime_value = f_double_prime(x)

        if abs(f_prime_value) < tolerance:
            break  # Convergence criterion

        x = x - f_prime_value / f_double_prime_value
        steps.append(x)

    return x, steps

initial_guess = 3
optimal_x, optimization_path = newtons_method(f_prime, f_double_prime, initial_guess)
print("Optimal x using Newton's Method:", optimal_x)  # Optimal x using Newton's Method: ~2.25

As you can see, it correctly finds the minimum of the function. Let's plot it!
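One way to visualize the optimization path is sketched below. This is a minimal example, assuming matplotlib is installed; the output file name is an arbitrary choice, and the definitions from the code above are repeated so the snippet runs on its own:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; assumes matplotlib is installed
import matplotlib.pyplot as plt
import numpy as np

def f(x):
    return x**4 - 3*x**3 + 2

def f_prime(x):
    return 4*x**3 - 9*x**2

def f_double_prime(x):
    return 12*x**2 - 18*x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        if abs(f_prime(x)) < tolerance:
            break
        x = x - f_prime(x) / f_double_prime(x)
        steps.append(x)
    return x, steps

optimal_x, path = newtons_method(f_prime, f_double_prime, 3)

# Draw the function's landscape and overlay the Newton steps
xs = np.linspace(-1, 3.5, 400)
plt.plot(xs, f(xs), label="f(x)")
plt.plot(path, [f(x) for x in path], "ro--", label="Newton steps")
plt.legend()
plt.savefig("newton_path.png")  # file name is an arbitrary choice
```

Each red point marks one iterate, so you can see the guesses sliding down toward the minimum near x = 2.25.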

Potential Pitfalls

One common problem with this optimization algorithm is that it can get stuck in a local minimum. Let's look at an example:

Here, due to a poor choice of initial guess, the algorithm didn't converge to the true minimum of the function.
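We can reproduce this failure with the routine from above. The starting value 0.5 below is an illustrative choice: from there the iterates drift toward the degenerate critical point at x = 0 (where both f' and f'' vanish) instead of the true minimum near x = 2.25:

```python
# Demonstration of a poor initial guess (x0 = 0.5 is an illustrative choice)
def f_prime(x):
    return 4*x**3 - 9*x**2

def f_double_prime(x):
    return 12*x**2 - 18*x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        if abs(f_prime(x)) < tolerance:
            break
        x = x - f_prime(x) / f_double_prime(x)
        steps.append(x)
    return x, steps

bad_x, _ = newtons_method(f_prime, f_double_prime, 0.5)
print(bad_x)  # close to 0 -- far from the true minimum at x = 2.25
```

Since x = 0 is a critical point of f (f'(0) = 0), the method has no reason to keep searching once it gets there.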

There are several methods to address this issue:

  1. Multiple Initial Guesses: Start the algorithm from different initial guesses to increase the chance of finding the global minimum.
  2. Modify the Function: Transform the original function to make the global minimum more prominent. For example, you can add a perturbation term to avoid local minima.
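The first strategy can be sketched as follows: run the routine from several starting points and keep the candidate with the lowest function value. The particular set of guesses below is an arbitrary choice for illustration:

```python
def f(x):
    return x**4 - 3*x**3 + 2

def f_prime(x):
    return 4*x**3 - 9*x**2

def f_double_prime(x):
    return 12*x**2 - 18*x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        if abs(f_prime(x)) < tolerance:
            break
        x = x - f_prime(x) / f_double_prime(x)
        steps.append(x)
    return x, steps

# Run Newton's Method from several starting points (illustrative choices)
guesses = [-1, 0.5, 3]
candidates = [newtons_method(f_prime, f_double_prime, g)[0] for g in guesses]

# Keep the candidate with the lowest function value
best_x = min(candidates, key=f)
print(best_x)  # the run started at x0 = 3 wins, converging near x = 2.25
```

This doesn't guarantee the global minimum, but it makes finding it much more likely.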

Lesson Summary

You've learned about Newton's Method for Optimization, how to update guesses using derivatives, and implemented it in Python. We used plots to visualize the steps.

Now, it's time to practice what you've learned. You'll use Newton's Method to optimize different functions and reinforce your understanding. Let's move to the practice section!
