Partial Derivative
What is a partial derivative?
The partial derivative of a function of two variables f(x, y) at a point (x, y) with respect to x is defined as the limit $$ \frac{\partial f(x,y)}{\partial x} = \lim_{h \rightarrow 0 } \frac{f(x+h,\,y)-f(x,y)}{h} $$ Likewise, the partial derivative at (x, y) with respect to y is $$ \frac{\partial f(x,y)}{\partial y} = \lim_{h \rightarrow 0 } \frac{f(x,\,y+h)-f(x,y)}{h} $$
If the limit exists and is finite, we say that the partial derivative exists at that point.
How to compute a partial derivative
To compute the partial derivative with respect to one variable, treat all other variables as constants.
Then differentiate the function with respect to the chosen variable using standard differentiation rules.
Note. If the function depends on more than two variables - for example, f(x, y, z) - treat the remaining variables as constants and differentiate with respect to the one of interest.
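The limit definition lends itself to a quick numerical illustration: evaluate the difference quotient for a small step \( h \). The sketch below uses the sample function \( f(x,y) = x^2 + y^3 \) (the same function as in the worked example later on); the helper names `partial_x` and `partial_y` and the step size are illustrative choices, not a library API.

```python
# Numerical sketch of the limit definition of a partial derivative.
# The helper names and the step size h are illustrative choices.

def f(x, y):
    # Sample function: f(x, y) = x^2 + y^3
    return x**2 + y**3

def partial_x(g, x, y, h=1e-6):
    # Difference quotient (g(x+h, y) - g(x, y)) / h, holding y fixed
    return (g(x + h, y) - g(x, y)) / h

def partial_y(g, x, y, h=1e-6):
    # Difference quotient (g(x, y+h) - g(x, y)) / h, holding x fixed
    return (g(x, y + h) - g(x, y)) / h

# Exact values at (2, 1): df/dx = 2x = 4 and df/dy = 3y^2 = 3
print(partial_x(f, 2.0, 1.0))  # close to 4
print(partial_y(f, 2.0, 1.0))  # close to 3
```

Shrinking \( h \) further brings the quotient closer to the exact value, exactly as the limit suggests (until floating-point round-off takes over).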
Partial derivatives are often denoted using various notations:
$$ D_x \, f \qquad D_y \, f $$
$$ \partial_x \, f \qquad \partial_y \, f $$
$$ \frac{\partial f}{\partial x} \qquad \frac{\partial f}{\partial y} $$
or simply \( f_x \) and \( f_y \).
What if the function depends on three variables?
When a function depends on multiple variables, the partial derivative is taken with respect to one of them, treating all the others as constants.
For instance, the partial derivative of f(x, y, z) with respect to x is:
$$ \frac{\partial f(x,y,z)}{\partial x} = \lim_{\Delta x \rightarrow 0 } \frac{f(x+\Delta x, \ y, \ z)-f(x,y,z) }{ \Delta x } $$
What does a partial derivative tell us?
The partial derivative with respect to x describes the rate of change - or slope - of the function in the x-direction.
Note. In the same way, you can compute the partial derivatives with respect to y and z: $$ \frac{\partial f(x,y,z)}{\partial y} = \lim_{\Delta y \rightarrow 0 } \frac{f(x, \ y+\Delta y, \ z)-f(x,y,z)}{ \Delta y } $$ $$ \frac{\partial f(x,y,z)}{\partial z} = \lim_{\Delta z \rightarrow 0 } \frac{f(x, \ y, \ z +\Delta z )-f(x,y,z)}{ \Delta z } $$ These represent the slopes of the function in the y- and z-directions, respectively.
In general, a function of \( n \) variables will have \( n \) partial derivatives - one for each independent variable.
A practical example
Consider the function f(x, y), which depends on two independent variables:
$$ f(x,y) = x^2 + y^3 $$
Let’s compute the partial derivative of f(x, y) with respect to x.
We treat y as a constant:
$$ D_x f(x,y) = D_x[x^2 + y^3] = 2x $$
Note. Since \( y^3 \) is treated as a constant with respect to \( x \), its derivative is zero: \( D_x[y^3] = 0 \).
Now let’s differentiate f(x, y) with respect to y, treating x as a constant:
$$ D_y f(x,y) = D_y[x^2 + y^3] = 3y^2 $$
So, the partial derivatives of f(x, y) are:
$$ D_x f(x,y) = 2x $$
$$ D_y f(x,y) = 3y^2 $$
Note. In this example, the function f(x, y) is differentiable with respect to both variables. Therefore, the partial derivatives \( D_x f \) and \( D_y f \) together form the gradient: $$ D \, f = [\, D_x f ,\ D_y f \,] $$
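The worked example can be verified symbolically. The sketch below assumes SymPy is available; `sp.diff` differentiates with respect to the given symbol while treating every other symbol as a constant, which is exactly the rule described above.

```python
# Symbolic check of the worked example, assuming SymPy is available.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + y**3

fx = sp.diff(f, x)  # y is held constant, so y**3 drops out
fy = sp.diff(f, y)  # x is held constant, so x**2 drops out
gradient = [fx, fy]

print(fx)        # 2*x
print(fy)        # 3*y**2
print(gradient)  # [2*x, 3*y**2]
```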
Additional Examples of Partial Derivatives
Here are more examples of first-order partial derivatives for functions of two variables.
\( f(x, y) \) | \( f_x \) | \( f_y \) |
---|---|---|
\( x^2+y^3 \) | \( 2x \) | \( 3y^2 \) |
\( x^3y + y^3x \) | \( 3x^2y + y^3 \) | \( x^3 + 3y^2x \) |
\( x^2 + 3xy + y^2 \) | \( 2x + 3y \) | \( 3x + 2y \) |
\( e^x \sin(y) \) | \( e^x \sin(y) \) | \( e^x \cos(y) \) |
\( \sin(xy^2) \) | \( \cos(xy^2) \cdot y^2 \) | \( \cos(xy^2) \cdot 2xy \) |
\( e^{xy} \cos y \) | \( e^{xy} \cdot y \cos y \) | \( x \cdot e^{xy} \cos y - e^{xy} \sin y \) |
Since \( x \) appears only in the exponential factor \( e^{xy} \), the factor \( \cos y \) is treated as a constant when differentiating with respect to \( x \). | As \( y \) appears in both factors, the product rule must be applied when differentiating with respect to \( y \). | |
\( \ln(x^2 + y^2) \) | \( \frac{2x}{x^2 + y^2} \) | \( \frac{2y}{x^2 + y^2} \) |
\( \log(1+x^2+y^4) \) | \( \frac{2x}{1+x^2+y^4} \) | \( \frac{4y^3}{1+x^2+y^4} \) |
\( \frac{1}{x^2 + y^2} \) | \( \frac{-2x}{(x^2 + y^2)^2} \) | \( \frac{-2y}{(x^2 + y^2)^2} \) |
\( \arctan\left(\frac{y}{x}\right) \) | \( \frac{-y}{x^2 + y^2} \) | \( \frac{x}{x^2 + y^2} \) |
\( x^y \) | \( y x ^{y-1} \) | \( x^y \log x \) |
This is analogous to differentiating \( x^k \), where \( k \) is constant. | This is similar to differentiating \( k^y \), with \( k \) treated as constant. | |
\( \sqrt{1 + x^2 + y^2} \) | \( \frac{x}{\sqrt{1 + x^2 + y^2}} \) | \( \frac{y}{\sqrt{1 + x^2 + y^2}} \) |
\( \frac{xy}{x^2 + y^2} \) | \( \frac{y(y^2 - x^2)}{(x^2 + y^2)^2} \) | \( \frac{x(x^2 - y^2)}{(x^2 + y^2)^2} \) |
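Entries like these are easy to spot-check symbolically. The sketch below, assuming SymPy is available, verifies two of the rows above by simplifying the difference between the computed derivative and the tabulated one.

```python
# Spot-checking two rows of the two-variable table, assuming SymPy is available.
import sympy as sp

x, y = sp.symbols('x y')

# Row: f = sin(x*y^2)  (chain rule in each variable)
f1 = sp.sin(x * y**2)
assert sp.simplify(sp.diff(f1, x) - sp.cos(x * y**2) * y**2) == 0
assert sp.simplify(sp.diff(f1, y) - sp.cos(x * y**2) * 2 * x * y) == 0

# Row: f = x*y / (x^2 + y^2)  (quotient rule)
f2 = x * y / (x**2 + y**2)
assert sp.simplify(sp.diff(f2, x) - y * (y**2 - x**2) / (x**2 + y**2)**2) == 0
assert sp.simplify(sp.diff(f2, y) - x * (x**2 - y**2) / (x**2 + y**2)**2) == 0

print("table rows verified")
```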
Now let’s look at some partial derivatives involving functions of three variables.
\( f(x, y, z) \) | \( f_x \) | \( f_y \) | \( f_z \) |
---|---|---|---|
\( x^2 + y^2 + z^2 \) | \( 2x \) | \( 2y \) | \( 2z \) |
\( xyz \) | \( yz \) | \( xz \) | \( xy \) |
\( e^{x+y+z} \) | \( e^{x+y+z} \) | \( e^{x+y+z} \) | \( e^{x+y+z} \) |
\( \sin(xy + z) \) | \( \cos(xy + z) \cdot y \) | \( \cos(xy + z) \cdot x \) | \( \cos(xy + z) \) |
\( \ln(x^2 + y^2 + z^2) \) | \( \frac{2x}{x^2 + y^2 + z^2} \) | \( \frac{2y}{x^2 + y^2 + z^2} \) | \( \frac{2z}{x^2 + y^2 + z^2} \) |
\( x^y + z^x \) | \( yx^{y-1} + \log(z) z^x \) | \( x^y \log x \) | \( x z^{x-1} \) |
\( \sqrt{1 + x^2 + y^2 + z^2} \) | \( \frac{x}{\sqrt{1 + x^2 + y^2 + z^2}} \) | \( \frac{y}{\sqrt{1 + x^2 + y^2 + z^2}} \) | \( \frac{z}{\sqrt{1 + x^2 + y^2 + z^2}} \) |
\( \frac{xz}{y^2 + 1} \) | \( \frac{z}{y^2 + 1} \) | \( \frac{-2xyz}{(y^2 + 1)^2} \) | \( \frac{x}{y^2 + 1} \) |
\( \arctan\left(\frac{y + z}{x}\right) \) | \( \frac{-(y+z)}{x^2 + (y+z)^2} \) | \( \frac{x}{x^2 + (y+z)^2} \) | \( \frac{x}{x^2 + (y+z)^2} \) |
\( x^y z \) | \( yx^{y-1}z \) | \( x^y \log x \cdot z \) | \( x^y \) |
For \( f_x \), \( z \) is a constant factor and \( x^y \) is differentiated as a power with variable base | For \( f_y \), \( x^y \) is differentiated as an exponential with variable exponent, keeping the constant factor \( z \) | Since \( x^y \) is constant with respect to \( z \), it remains unchanged |
The same rules and reasoning apply when dealing with functions of \( n \) variables.
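One row of the three-variable table that mixes several rules, \( x^y + z^x \), can likewise be checked symbolically. The sketch below assumes SymPy is available; the symbols are declared positive to sidestep branch issues with non-integer powers.

```python
# Checking the row f = x^y + z^x of the three-variable table,
# assuming SymPy is available. Positive symbols avoid branch
# issues when differentiating powers with variable base/exponent.
import sympy as sp

x, y, z = sp.symbols('x y z', positive=True)
f = x**y + z**x

assert sp.simplify(sp.diff(f, x) - (y * x**(y - 1) + sp.log(z) * z**x)) == 0
assert sp.simplify(sp.diff(f, y) - x**y * sp.log(x)) == 0
assert sp.simplify(sp.diff(f, z) - x * z**(x - 1)) == 0

print("row verified")
```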
Notes
Additional insights and remarks on partial derivatives.
- Partial derivatives do not imply continuity or differentiability
The existence of partial derivatives alone does not guarantee that a function is continuous or differentiable. Example. Consider the function: \[
f(x,y) =
\begin{cases}
\displaystyle \frac{2xy}{x^2+y^2} & \text{if} \ (x,y) \neq (0,0) \\
0 & \text{if} \ (x,y) = (0,0)
\end{cases}
\] The partial derivatives of this function exist at \((0,0)\) and are both zero: the function vanishes identically along both coordinate axes (\( f(x,0) = 0 \) for every \(x\) and \( f(0,y) = 0 \) for every \(y\)), so the difference quotients in the limit definition are zero throughout: \[ \frac{\partial f}{\partial x}(0,0) = 0 \quad \text{and} \quad \frac{\partial f}{\partial y}(0,0) = 0 \] However, the function is not continuous at \((0,0)\), because the limit of \( f(x,y) \) as \((x,y)\) approaches \((0,0)\) depends on the path taken. For instance, approaching along the \(x\)-axis (\(y = 0\)) yields: \[ \lim_{x \rightarrow 0} f(x,0) = \lim_{x \rightarrow 0} \frac{2x \cdot 0}{x^2 + 0} = 0 \] Similarly, approaching along the \(y\)-axis (\(x = 0\)) also gives: \[ \lim_{y \rightarrow 0} f(0,y) = \lim_{y \rightarrow 0} \frac{2 \cdot 0 \cdot y}{0 + y^2} = 0 \] So far, everything seems consistent. However, approaching along the line \( y = x \) gives: \[ \lim_{x \rightarrow 0} f(x,x) = \lim_{x \rightarrow 0} \frac{2x^2}{2x^2} = 1 \] Thus, the limit \(\lim_{(x,y) \to (0,0)} f(x,y)\) does not exist, as it depends on the direction of approach. Whenever the limit fails to exist - or fails to match the value of the function at the point - the function is not continuous there. In conclusion, the partial derivatives exist at \((0,0)\), but the function itself is not continuous at that point. - Higher-order partial derivatives
Higher-order partial derivatives are obtained by differentiating repeatedly: \[ \frac{\partial^2 f}{\partial x_i \partial x_k} \] They are called "pure" derivatives when differentiation is repeated with respect to the same variable, and "mixed" derivatives when taken with respect to different variables. \[ \frac{\partial^2 f}{\partial x_i \partial x_k}, \quad D^2_{x_i x_k} f, \quad f_{x_k x_i} \] Example. Let's consider the function: \[ f(x,y) = xy^2 \] First, we compute the first-order partial derivatives: with respect to \(x\): \[ \frac{\partial f}{\partial x} = y^2 \] and with respect to \(y\): \[ \frac{\partial f}{\partial y} = 2xy \] Now, the pure second-order derivatives: the second derivative with respect to \(x\) twice is: \[ \frac{\partial^2 f}{\partial x^2} = \frac{\partial}{\partial x}(y^2) = 0 \] and with respect to \(y\) twice: \[ \frac{\partial^2 f}{\partial y^2} = \frac{\partial}{\partial y}(2xy) = 2x \] Next, we compute the mixed second-order derivatives. Differentiating first with respect to \(x\), then \(y\): \[ \frac{\partial^2 f}{\partial y \partial x} = \frac{\partial}{\partial y}\left(\frac{\partial f}{\partial x}\right) = \frac{\partial}{\partial y}(y^2) = 2y \] or first with respect to \(y\), then \(x\): \[ \frac{\partial^2 f}{\partial x \partial y} = \frac{\partial}{\partial x}\left(\frac{\partial f}{\partial y}\right) = \frac{\partial}{\partial x}(2xy) = 2y \] As we can see, the mixed second-order derivatives are equal. This occurs because the mixed second-order derivatives are continuous, consistent with Schwarz's theorem.
- Schwarz’s Theorem
For a function of two or more variables, if the mixed second-order partial derivatives are continuous, the order of differentiation can be interchanged: \[ \frac{\partial^2 f}{\partial x_i \partial x_k} = \frac{\partial^2 f}{\partial x_k \partial x_i} \]
In other words, changing the order of differentiation yields the same result. Thanks to this, the number of distinct second-order derivatives that must be computed drops from \( n^2 \) to \( \frac{n(n+1)}{2} \), where \(n\) is the number of variables. For a function of a single variable, the question does not arise, since there is only one variable to differentiate with respect to.
Example. Consider a simple function of two variables: \[ f(x,y) = x^2y + 3xy^2 \] The first-order partial derivatives are: \[ \frac{\partial f}{\partial x} = 2xy + 3y^2 \] \[ \frac{\partial f}{\partial y} = x^2 + 6xy \] The pure second-order derivatives are: \[ \frac{\partial^2 f}{\partial x^2} = \frac{\partial}{\partial x}(2xy + 3y^2) = 2y \] \[ \frac{\partial^2 f}{\partial y^2} = \frac{\partial}{\partial y}(x^2 + 6xy) = 6x \] The mixed second-order derivatives are: \[ \frac{\partial^2 f}{\partial y \partial x} = \frac{\partial}{\partial y}(2xy + 3y^2) = 2x + 6y \] \[ \frac{\partial^2 f}{\partial x \partial y} = \frac{\partial}{\partial x}(x^2 + 6xy) = 2x + 6y \] As expected from Schwarz’s theorem, since the second-order mixed derivatives are continuous, swapping the order of differentiation produces the same result. This greatly reduces the computational effort: instead of computing four second derivatives, we only need to compute three.
The same reasoning extends to higher-order derivatives and to functions of any number of variables.