In vector calculus, the Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function. Its importance lies in the fact that it represents the best linear approximation to a differentiable function near a given point.
The Jacobian matrix is named after the mathematician Carl Gustav Jacobi; the term "Jacobian" is pronounced as "yah-KO-bee-un".
Suppose F : Rⁿ → Rᵐ is a function from Euclidean n-space to Euclidean m-space. Such a function is given by m real-valued component functions, y1(x1,...,xn), ..., ym(x1,...,xn). The partial derivatives of all these functions (if they exist) can be organized in an m-by-n matrix, the Jacobian matrix of F, as follows:

J_F = \begin{pmatrix} \dfrac{\partial y_1}{\partial x_1} & \cdots & \dfrac{\partial y_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial y_m}{\partial x_1} & \cdots & \dfrac{\partial y_m}{\partial x_n} \end{pmatrix}
If p is a point in Rⁿ and F is differentiable at p, then its derivative is given by J_F(p) (and this is the easiest way to compute said derivative). In this case, the linear map described by J_F(p) is the best linear approximation of F near the point p, in the sense that

F(x) \approx F(p) + J_F(p)\,(x - p)

for x close to p.
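This linear-approximation property can be checked numerically. The sketch below (all function and variable names are this sketch's own, not from the article) approximates a Jacobian by forward differences and compares F(x) against F(p) + J_F(p)(x − p) for a nearby point x:

```python
import math

def jacobian(F, p, h=1e-6):
    """Forward-difference approximation of the m-by-n Jacobian of F at p."""
    n = len(p)
    Fp = F(p)
    m = len(Fp)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        q = list(p)
        q[j] += h                      # perturb the j-th coordinate
        Fq = F(q)
        for i in range(m):
            J[i][j] = (Fq[i] - Fp[i]) / h
    return J

def F(x):
    # Example map R^2 -> R^2: polar-to-Cartesian coordinates
    r, t = x
    return [r * math.cos(t), r * math.sin(t)]

p = [2.0, 0.5]
J = jacobian(F, p)

# Best linear approximation of F near p, evaluated at a nearby x:
x = [2.01, 0.52]
Fp = F(p)
approx = [Fp[i] + sum(J[i][j] * (x[j] - p[j]) for j in range(2))
          for i in range(2)]
```

For x this close to p, the linear approximation agrees with F(x) to roughly the square of the step size, which is exactly the sense of "best linear approximation" described above.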
Example
The Jacobian matrix of the function F : R³ → R⁴ with components:

y_1 = x_1
y_2 = 5x_3
y_3 = 4x_2^2 - 2x_3
y_4 = x_3 \sin x_1

is:

J_F = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 5 \\ 0 & 8x_2 & -2 \\ x_3\cos x_1 & 0 & \sin x_1 \end{pmatrix}

Jacobian determinant
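A worked example like this is easy to verify by machine. The sketch below (an assumption: the four components used here are this sketch's own choice of an R³ → R⁴ map, and may differ from the one listed in the text) writes out an analytic Jacobian and spot-checks every entry with central differences:

```python
import math

def F(x):
    # A sample map R^3 -> R^4 (this sketch's own choice of components)
    x1, x2, x3 = x
    return [x1, 5 * x3, 4 * x2**2 - 2 * x3, x3 * math.sin(x1)]

def J_analytic(x):
    # Hand-computed 4-by-3 Jacobian: row i holds the partials of the
    # i-th component with respect to x1, x2, x3.
    x1, x2, x3 = x
    return [
        [1.0,                0.0,     0.0],
        [0.0,                0.0,     5.0],
        [0.0,                8 * x2, -2.0],
        [x3 * math.cos(x1),  0.0,     math.sin(x1)],
    ]

def J_numeric(F, x, h=1e-6):
    """Central-difference approximation of the Jacobian of F at x."""
    m = len(F(x))
    J = []
    for i in range(m):
        row = []
        for j in range(len(x)):
            xp, xm = list(x), list(x)
            xp[j] += h
            xm[j] -= h
            row.append((F(xp)[i] - F(xm)[i]) / (2 * h))
        J.append(row)
    return J

x = [1.0, 2.0, 3.0]
```

Comparing `J_analytic(x)` against `J_numeric(F, x)` entry by entry is a quick sanity check when computing Jacobians by hand.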
If m = n, then F is a function from n-space to n-space and its Jacobian matrix is a square matrix. We can then form its determinant, known as the Jacobian determinant.
The Jacobian determinant at a given point gives important information about the behavior of F near that point. For instance, the continuously differentiable function F is invertible near p if and only if the Jacobian determinant at p is non-zero. This is the inverse function theorem. Furthermore, if the Jacobian determinant at p is positive, then F preserves orientation near p; if it is negative, F reverses orientation. The absolute value of the Jacobian determinant at p gives us the factor by which the function F expands or shrinks volumes near p; this is why it occurs in the general substitution rule.
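The volume-scaling role of the Jacobian determinant is familiar from polar coordinates, where the substitution rule gives dA = r dr dθ. A minimal sketch (the map and function names are this sketch's own) computes the determinant of the polar-to-Cartesian Jacobian and shows it equals r:

```python
import math

def jacobian_det_polar(r, t):
    """Determinant of the Jacobian of F(r, t) = (r cos t, r sin t)."""
    # Analytic 2x2 Jacobian: J = [[cos t, -r sin t], [sin t, r cos t]]
    a, b = math.cos(t), -r * math.sin(t)
    c, d = math.sin(t),  r * math.cos(t)
    return a * d - b * c   # = r (cos^2 t + sin^2 t) = r

# For r > 0 the determinant is positive, so (consistent with the text)
# the map is locally invertible and orientation-preserving there, and
# small regions in the (r, t) plane have their area scaled by the
# factor r -- the dA = r dr dt of the substitution rule.
```

At r = 0 the determinant vanishes, which matches the failure of invertibility there: every angle t is sent to the same point, the origin.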