# 14: Differentiation of Functions of Several Variables

When dealing with a function of more than one independent variable, several questions naturally arise. For example, how do we calculate limits of functions of more than one variable? The definition of derivative we used before involved a limit. Does the new definition of derivative involve limits as well? Do the rules of differentiation apply in this context? Can we find relative extrema of functions using derivatives? All these questions are answered in this chapter.

- 14.0: Prelude to Differentiation of Functions of Several Variables
- Suppose, however, that we have a quantity that depends on more than one variable. For example, temperature can depend on location and the time of day, or a company’s profit model might depend on the number of units sold and the amount of money spent on advertising. Depending on the nature of the restrictions, both the method of solution and the solution itself change.

- 14.1: Functions of Several Variables
- Our first step is to explain what a function of more than one variable is, starting with functions of two independent variables. This step includes identifying the domain and range of such functions and learning how to graph them. We also examine ways to relate the graphs of functions in three dimensions to graphs of more familiar planar functions.

- 14.2: Limits and Continuity
- We have now examined functions of more than one variable and seen how to graph them. In this section, we see how to take the limit of a function of more than one variable, and what it means for a function of more than one variable to be continuous at a point in its domain. It turns out these concepts have aspects that just don’t occur with functions of one variable.
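As a preview of one such new aspect, a classic example (not unique to this text) shows that a two-variable limit can fail to exist because different paths of approach give different values:

\[
f(x,y)=\frac{xy}{x^2+y^2}, \qquad f(x,mx)=\frac{m\,x^2}{x^2+m^2x^2}=\frac{m}{1+m^2},
\]

so approaching \((0,0)\) along different lines \(y=mx\) yields different values, and \(\lim_{(x,y)\to(0,0)} f(x,y)\) does not exist.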

- 14.3: Partial Derivatives
- Finding derivatives of functions of two variables is the key concept in this chapter, with as many applications in mathematics, science, and engineering as differentiation of single-variable functions. However, we have already seen that limits and continuity of multivariable functions have new issues and require new terminology and ideas to deal with them. This carries over into differentiation as well.
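As a preview, the partial derivative with respect to \(x\) is defined by the familiar limit of a difference quotient, with the other variable held fixed:

\[
\frac{\partial f}{\partial x}(x,y)=\lim_{h\to 0}\frac{f(x+h,\,y)-f(x,y)}{h}.
\]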

- 14.4: Tangent Planes and Linear Approximations
- In this section, we consider the problem of finding the tangent plane to a surface, which is analogous to finding the equation of a tangent line to a curve when the curve is defined by the graph of a function of one variable, \(y=f(x)\). The slope of the tangent line at the point \(x=a\) is given by \(m=f'(a)\); what is the slope of a tangent plane? We learned about the equation of a plane in Equations of Lines and Planes in Space; in this section, we see how it can be applied to the problem at hand.
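As a preview of the answer developed in this section, the tangent plane to the graph of \(z=f(x,y)\) at the point \((a,b)\) is

\[
z = f(a,b) + f_x(a,b)\,(x-a) + f_y(a,b)\,(y-b),
\]

where \(f_x\) and \(f_y\) are the partial derivatives from the preceding section.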

- 14.5: The Chain Rule for Multivariable Functions
- In single-variable calculus, we found that one of the most useful differentiation rules is the chain rule, which allows us to find the derivative of the composition of two functions. The same thing is true for multivariable calculus, but this time we have to deal with more than one form of the chain rule. In this section, we study extensions of the chain rule and learn how to take derivatives of compositions of functions of more than one variable.
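As a preview, one of these forms applies when \(z=f(x,y)\) with \(x=x(t)\) and \(y=y(t)\):

\[
\frac{dz}{dt}=\frac{\partial z}{\partial x}\frac{dx}{dt}+\frac{\partial z}{\partial y}\frac{dy}{dt}.
\]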

- 14.6: Directional Derivatives and the Gradient
- A function \(z=f(x,y)\) has two partial derivatives: \(∂z/∂x\) and \(∂z/∂y\). These derivatives correspond to each of the independent variables and can be interpreted as instantaneous rates of change (that is, as slopes of a tangent line): \(∂z/∂x\) represents the slope of the tangent line parallel to the x-axis, and \(∂z/∂y\) represents the slope of the tangent line parallel to the y-axis. Now we consider the possibility of a tangent line parallel to neither axis.
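As a preview, the rate of change in the direction of a unit vector \(\vec u=\langle u_1,u_2\rangle\) is the directional derivative, expressible as a dot product with the gradient:

\[
D_{\vec u}f(x,y)=\nabla f(x,y)\cdot\vec u = f_x(x,y)\,u_1 + f_y(x,y)\,u_2.
\]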

- 14.7: Maxima/Minima Problems
- The use of derivatives to determine maximum and/or minimum values, familiar from functions of one variable, is also important for functions of two or more variables; but as we have seen in earlier sections of this chapter, the introduction of more independent variables leads to more possible outcomes for the calculations. The main ideas of finding critical points and using derivative tests are still valid, but new wrinkles appear when assessing the results.
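As a preview of one such wrinkle, the second derivative test for a critical point \((a,b)\) uses the discriminant

\[
D = f_{xx}(a,b)\,f_{yy}(a,b) - \big(f_{xy}(a,b)\big)^2:
\]

if \(D>0\) and \(f_{xx}(a,b)>0\) the point is a local minimum; if \(D>0\) and \(f_{xx}(a,b)<0\) it is a local maximum; if \(D<0\) it is a saddle point, an outcome with no single-variable analog.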

- 14.8: Lagrange Multipliers
- Solving optimization problems for functions of two or more variables can be similar to solving such problems in single-variable calculus. However, techniques for dealing with multiple variables allow us to solve more varied optimization problems for which we need to deal with additional conditions or constraints. In this section, we examine one of the more common and useful methods for solving optimization problems with constraints.
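As a preview, the method of Lagrange multipliers seeks extrema of \(f(x,y)\) subject to a constraint \(g(x,y)=c\) by solving the system

\[
\nabla f(x,y) = \lambda\,\nabla g(x,y), \qquad g(x,y)=c,
\]

where the scalar \(\lambda\) is called a Lagrange multiplier.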

- 14.E: Differentiation of Functions of Several Variables (Exercises)
- These are homework exercises to accompany Chapter 14 of OpenStax's "Calculus" Textmap.

*Thumbnail: Real function of two real variables. Image used with permission (Public Domain; Maschen).*

### Contributors

Gilbert Strang (MIT) and Edwin “Jed” Herman (Harvey Mudd) with many contributing authors. This content by OpenStax is licensed under a CC-BY 3.0 license. Download for free at http://cnx.org/contents/fd53eae1-fa2...49835c3c@5.191.