Lesson 1

Welcome to our lesson on **Newton's Method for Optimization**! This method helps us find the lowest point in a valley (minimum) or the highest peak on a mountain (maximum). By the end of this lesson, you'll understand Newton's Method, how it works, and how to use it in Python.

Imagine you're on a hike, looking for the lowest point in a valley. Newton's Method will guide you step-by-step to this point.

We're starting with a function $f(x)$ that we want to minimize. Let's use:

$f(x) = x^4 - 3x^3 + 2$

Here's a plot of this function:

This plot shows the function's landscape with multiple local minima and maxima.

To minimize this function using **Newton's Method**, we start with an *initial guess*. Let's choose $x_0 = 3$. The choice here is arbitrary.

The red point shows our starting point. We will update this guess step by step, moving closer to the minimum.

**Newton's Method** updates our guess using the first and second derivatives. The update formula is:

$x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}$

For our function $f(x) = x^4 - 3x^3 + 2$:

- $f'(x) = 4x^3 - 9x^2$
- $f''(x) = 12x^2 - 18x$
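As a quick sanity check, we can compute a single update step by hand from $x_0 = 3$ before writing the full loop:

```python
def f_prime(x):
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    return 12 * x**2 - 18 * x

# One Newton update: x1 = x0 - f'(x0) / f''(x0)
x0 = 3
x1 = x0 - f_prime(x0) / f_double_prime(x0)  # 3 - 27/54 = 2.5
print(x1)  # 2.5
```

A single step already moves us from $3$ to $2.5$, noticeably closer to the minimum.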

**Important note:** Newton's Method is designed to find a critical point. It could be a minimum, a maximum, or a saddle point.
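To tell which kind of critical point we landed on, we can check the sign of the second derivative at the converged point. Here is a minimal sketch; the `classify` helper is not part of the lesson's code, just an illustration:

```python
def f_double_prime(x):
    return 12 * x**2 - 18 * x

def classify(x, f_double_prime, tolerance=1e-6):
    # Positive curvature -> local minimum, negative -> local maximum.
    second = f_double_prime(x)
    if second > tolerance:
        return "minimum"
    if second < -tolerance:
        return "maximum"
    return "inconclusive"  # f''(x) is ~0: a higher-order test is needed

print(classify(2.25, f_double_prime))  # minimum
print(classify(0.0, f_double_prime))   # inconclusive
```

For our function, $f''(2.25) = 20.25 > 0$, so the critical point at $x = 2.25$ is indeed a minimum.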

Let's implement Newton's Method in Python and see it in action.

```python
def f_prime(x):
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    return 12 * x**2 - 18 * x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        f_prime_value = f_prime(x)
        f_double_prime_value = f_double_prime(x)

        if abs(f_prime_value) < tolerance:
            break  # Convergence criterion

        x = x - f_prime_value / f_double_prime_value
        steps.append(x)

    return x, steps
```

Here, we take repeated steps according to the formula above. We stop once the first derivative is very close to zero, indicating a critical point has been reached. The code also caps the number of iterations; this is needed because if the minimum doesn't exist or can't be found, the process would otherwise loop forever.

Our function keeps track of all the steps, so we can plot them later.

Here is the full code that uses Newton's Method to minimize $f(x)$:

```python
def f_prime(x):
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    return 12 * x**2 - 18 * x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        f_prime_value = f_prime(x)
        f_double_prime_value = f_double_prime(x)

        if abs(f_prime_value) < tolerance:
            break  # Convergence criterion

        x = x - f_prime_value / f_double_prime_value
        steps.append(x)

    return x, steps

initial_guess = 3
optimal_x, optimization_path = newtons_method(f_prime, f_double_prime, initial_guess)
print("Optimal x using Newton's Method:", optimal_x)  # Optimal x using Newton's Method: ~2.25
```

As you can see, it correctly finds the minimum of the function. Let's plot it!

One common problem with this optimization algorithm is that it can get stuck in a local minimum. Let's look at this example:

Here, due to a poor choice of the initial guess, the algorithm didn't converge to the true minimum of the function.

There are several methods to address this issue:

- **Multiple Initial Guesses**: Start the algorithm from different initial guesses to increase the chance of finding the global minimum.
- **Modify the Function**: Transform the original function to make the global minimum more prominent. For example, you can add a perturbation term to avoid local minima.
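The multiple-initial-guesses idea can be sketched as follows, reusing `newtons_method` from above. The particular restart points chosen here are arbitrary; we run the method from each one and keep the candidate with the lowest function value:

```python
def f(x):
    return x**4 - 3 * x**3 + 2

def f_prime(x):
    return 4 * x**3 - 9 * x**2

def f_double_prime(x):
    return 12 * x**2 - 18 * x

def newtons_method(f_prime, f_double_prime, x0, max_iterations=10, tolerance=1e-6):
    x = x0
    steps = [x]  # Track the optimization path
    for _ in range(max_iterations):
        f_prime_value = f_prime(x)
        f_double_prime_value = f_double_prime(x)
        if abs(f_prime_value) < tolerance:
            break  # Convergence criterion
        x = x - f_prime_value / f_double_prime_value
        steps.append(x)
    return x, steps

# Run from several starting points and keep the best candidate.
guesses = [-1.0, 1.0, 3.0, 5.0]
candidates = [newtons_method(f_prime, f_double_prime, x0)[0] for x0 in guesses]
best = min(candidates, key=f)
print("Best x found:", best)
```

Starting points such as $x_0 = 1$ drift toward the degenerate critical point at $x = 0$, while $x_0 = 3$ and $x_0 = 5$ reach the true minimum near $x = 2.25$; taking the candidate with the smallest $f(x)$ filters out the bad runs.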

You've learned about Newton's Method for Optimization, how to update guesses using derivatives, and implemented it in Python. We used plots to visualize the steps.

Now, it's time to practice what you've learned. You'll use Newton's Method to optimize different functions and reinforce your understanding. Let's move to the practice section!