Chapter 13: Functions of Multiple Variables and Partial Derivatives
- Section 13.1: Functions of Multiple Variables
- Our first step is to explain what a function of more than one variable is, starting with functions of two independent variables. This step includes identifying the domain and range of such functions and learning how to graph them. We also examine ways to relate the graphs of functions in three dimensions to graphs of more familiar planar functions.
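As a hedged illustration of the domain and range ideas described above (this specific function is chosen here for illustration and is not necessarily the one used in the section), consider the upper hemisphere of radius 3:

```latex
f(x,y) = \sqrt{9 - x^2 - y^2},
\qquad \text{domain: } \{(x,y) : x^2 + y^2 \le 9\},
\qquad \text{range: } [0,\,3].
```

The square root requires \(9 - x^2 - y^2 \ge 0\), which restricts the domain to a closed disk in the plane, and the output values run from 0 (on the boundary circle) up to 3 (at the origin).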
- Section 13.5: The Chain Rule for Multivariable Functions
- In single-variable calculus, we found that one of the most useful differentiation rules is the chain rule, which allows us to find the derivative of the composition of two functions. The same thing is true for multivariable calculus, but this time we have to deal with more than one form of the chain rule. In this section, we study extensions of the chain rule and learn how to take derivatives of compositions of functions of more than one variable.
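One common form of the multivariable chain rule mentioned above, for \(z = f(x,y)\) where \(x\) and \(y\) each depend on a single parameter \(t\), can be sketched as:

```latex
\frac{dz}{dt}
= \frac{\partial z}{\partial x}\,\frac{dx}{dt}
+ \frac{\partial z}{\partial y}\,\frac{dy}{dt}.
```

Other forms arise when the inner functions depend on more than one variable, in which case the left side becomes a partial derivative and a similar sum is taken over all intermediate variables.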
- Section 13.6: Directional Derivatives and the Gradient
- A function z=f(x,y) has two partial derivatives: ∂z/∂x and ∂z/∂y. These derivatives correspond to each of the independent variables and can be interpreted as instantaneous rates of change (that is, as slopes of a tangent line). The derivative ∂z/∂x represents the slope of the tangent line parallel to the x-axis, and ∂z/∂y represents the slope of the tangent line parallel to the y-axis. Now we consider the possibility of a tangent line parallel to neither axis.
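The directional derivative formalizes this idea. For a unit vector \(\vec u = \langle \cos\theta, \sin\theta \rangle\), the rate of change of \(f\) in the direction of \(\vec u\) can be written in terms of the gradient:

```latex
D_{\vec u} f(x,y) = \nabla f(x,y) \cdot \vec u
= f_x(x,y)\cos\theta + f_y(x,y)\sin\theta,
\qquad \nabla f = \langle f_x,\, f_y \rangle.
```

Choosing \(\theta = 0\) or \(\theta = \pi/2\) recovers the two partial derivatives as special cases.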
- Section 13.8: Optimization of Functions of Several Variables
- Determining maximum and/or minimum values, one of the central applications of derivatives of a function of one variable, is also important for functions of two or more variables. But as we have seen in earlier sections of this chapter, the introduction of more independent variables leads to more possible outcomes for the calculations. The main ideas of finding critical points and using derivative tests are still valid, but new wrinkles appear when assessing the results.
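The standard derivative test in two variables is the second partials test. At a critical point \((a,b)\) where \(f_x(a,b) = f_y(a,b) = 0\), one examines the discriminant:

```latex
D = f_{xx}(a,b)\,f_{yy}(a,b) - \bigl(f_{xy}(a,b)\bigr)^2.
```

If \(D > 0\) and \(f_{xx}(a,b) > 0\), the point is a local minimum; if \(D > 0\) and \(f_{xx}(a,b) < 0\), a local maximum; if \(D < 0\), a saddle point; and if \(D = 0\), the test is inconclusive — one of the new wrinkles that does not arise in single-variable calculus.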
- Section 13.10: Lagrange Multipliers
- Solving optimization problems for functions of two or more variables can be similar to solving such problems in single-variable calculus. However, techniques for dealing with multiple variables allow us to solve more varied optimization problems for which we need to deal with additional conditions or constraints. In this section, we examine one of the more common and useful methods for solving optimization problems with constraints.
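The method of Lagrange multipliers can be summarized as follows: to optimize \(f(x,y)\) subject to a constraint \(g(x,y) = c\), one looks for points where the gradients of \(f\) and \(g\) are parallel:

```latex
\nabla f(x,y) = \lambda\, \nabla g(x,y),
\qquad g(x,y) = c,
```

where \(\lambda\) is the Lagrange multiplier. Solving this system of equations yields the candidate points at which the constrained extrema can occur.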