Lesson 3

Welcome to our lesson on **"Derivatives for Multivariable Functions"**! In machine learning and data science, understanding how changes in inputs affect outputs is crucial. This is where **derivatives** come into play, especially when dealing with functions that depend on multiple variables.

By the end of this lesson, you'll understand what **partial derivatives** are, how to calculate them, and why they are important for machine learning. You will also learn how to implement these concepts using Python. Imagine baking a cake and figuring out how changing the amount of sugar or flour affects the taste. **Partial derivatives** help answer such questions for functions with multiple inputs.

Let's start by discussing what **derivatives** are. A **derivative** measures how a function changes as its input changes. For single-variable functions, this means looking at a small change in $x$ and seeing how it affects $f(x)$. In other words, we calculate the speed of change of the function $f(x)$ at some specific point $x$.
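This idea can be sketched numerically: we nudge $x$ by a tiny step $h$ and see how much $f(x)$ changes. (The function name `derivative` and the sample function here are illustrative choices, not part of the lesson.)

```python
# A minimal sketch: estimating the derivative of a single-variable function
# numerically with a small step h.

def derivative(f, x, h=1e-5):
    # Approximate f'(x) as the change in f over a small change in x
    return (f(x + h) - f(x)) / h

# Example: f(x) = x^2, whose exact derivative at x = 3 is 2 * 3 = 6
f = lambda x: x**2
print(derivative(f, 3))  # close to 6
```

The result is only approximately 6 because $h$ is small but not infinitesimal; this same approximation idea reappears later in the lesson for partial derivatives.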

In multivariable functions, we use **partial derivatives**. A **partial derivative** measures how a function changes as one specific variable changes while keeping the other variables constant.

Think about a multivariable function like $f(x, y)$. If we want to know how $f$ changes with respect to $x$, we compute the partial derivative of $f$ with respect to $x$, denoted as $\frac{\partial f}{\partial x}$. This is like asking, "If I change $x$ a little, how does $f$ change?" In other words, we calculate the speed of change of the function $f(x, y)$ along the $x$-axis. Similarly, for $y$, we compute $\frac{\partial f}{\partial y}$.

**Partial derivatives** are essential in machine learning, especially during the training of models. They help us understand the slope, or gradient, of error functions, guiding optimization algorithms like **Gradient Descent**.

Imagine trying to climb a mountain and wanting to know the steepness of your path. By understanding the slope in different directions, you can choose the easiest path to climb. Similarly, **partial derivatives** help guide the adjustments of model parameters to minimize errors.
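The mountain analogy can be sketched in code: **Gradient Descent** repeatedly steps opposite to the partial derivatives to walk "downhill" toward a minimum. (This is a minimal sketch; the starting point and learning rate are arbitrary illustrative choices.)

```python
# A minimal gradient descent sketch for f(x, y) = x^2 + y^2,
# whose minimum is at (0, 0).

def grad_f(x, y):
    # Partial derivatives of f(x, y) = x^2 + y^2
    return 2 * x, 2 * y

x, y = 3.0, 4.0       # arbitrary starting point
learning_rate = 0.1   # arbitrary step size

for _ in range(100):
    dx, dy = grad_f(x, y)
    x -= learning_rate * dx  # step opposite to the slope in x
    y -= learning_rate * dy  # step opposite to the slope in y

print(x, y)  # both values approach 0, the minimum of f
```

Each step shrinks $x$ and $y$ toward zero, just as following the downhill slope leads to the bottom of the valley.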

To calculate a **partial derivative**, you take the derivative of the function while treating the other variables as constants. In the previous course of this path, we saw examples of functions and their derivatives. Though we omitted the calculation rules, it is important to remember that they exist and allow us to construct a function $f'(x)$, which is the derivative of $f(x)$.

Similarly, we can calculate derivative functions for multivariable functions. Suppose $f(x, y) = x^2 + y^2$:

- The partial derivative with respect to $x$ is $\frac{\partial f}{\partial x} = 2x$.
- The partial derivative with respect to $y$ is $\frac{\partial f}{\partial y} = 2y$.

Here is a step-by-step breakdown:

- Identify the variable with respect to which you want to differentiate.
- Treat all other variables as constants.
- Differentiate with respect to the chosen variable.
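The steps above can also be checked symbolically. Here is a sketch using SymPy, a symbolic math library (a tooling choice of this example, not something the lesson itself uses):

```python
# Verifying the hand-computed partial derivatives of f(x, y) = x^2 + y^2
# symbolically with SymPy (library choice is illustrative).
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**2

# Differentiating with respect to one variable automatically treats
# the other as a constant
df_dx = sp.diff(f, x)  # 2*x
df_dy = sp.diff(f, y)  # 2*y

print(df_dx, df_dy)
```

The symbolic results match the derivatives computed by hand above.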

Let's see how this can be implemented programmatically.

Here’s a Python code snippet that demonstrates how to compute partial derivatives using finite difference approximation:

```python
# Partial derivative with respect to x
def partial_derivative_x(f, x, y, h=1e-5):
    return (f(x + h, y) - f(x, y)) / h

# Partial derivative with respect to y
def partial_derivative_y(f, x, y, h=1e-5):
    return (f(x, y + h) - f(x, y)) / h

# Sample function: f(x, y) = x^2 + y^2
f = lambda x, y: x**2 + y**2

# Compute partial derivatives at (1, 2)
print("Partial derivative w.r.t x at (1, 2):", partial_derivative_x(f, 1, 2))  # ~2 (might differ slightly due to computational errors)
print("Partial derivative w.r.t y at (1, 2):", partial_derivative_y(f, 1, 2))  # ~4 (might differ slightly due to computational errors)
```

`partial_derivative_x` calculates the partial derivative with respect to $x$ by perturbing $x$ by a small value $h$ and computing the difference quotient. Note that $y$ stays the same because it is treated as a constant and shouldn't change. `partial_derivative_y` does the same for $y$, treating $x$ as a constant.
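As an optional refinement beyond the lesson's snippet, the *central difference* evaluates the function on both sides of the point, which reduces the approximation error. (The function name below is an illustrative choice.)

```python
# A more accurate finite-difference variant: the central difference
# samples both sides of x, so the errors largely cancel out.

def central_partial_x(f, x, y, h=1e-5):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

f = lambda x, y: x**2 + y**2
print(central_partial_x(f, 1, 2))  # very close to the exact value 2
```

For this particular quadratic function the central difference happens to be essentially exact, but in general it trades one extra function evaluation for noticeably better accuracy.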

Great job! In this lesson, you learned what **partial derivatives** are, why they are important, and how to calculate them both theoretically and using Python. **Partial derivatives** help in understanding how changes in one variable affect a multivariable function, which is crucial for optimizing machine learning models.

Up next, you'll get to practice these concepts by solving various tasks using the CodeSignal IDE. Let's apply what you've learned and get more comfortable with calculating and interpreting **partial derivatives**. Happy coding!