Differentials

This page is a sub-page of our page on Calculus of Several Real Variables.

///////

Related KMR pages:

Differentials of higher order

///////

Other relevant sources of information:

Differential (infinitesimal)
Differential forms

/////// Quoting Wikipedia (on Differential (infinitesimal)):

The term differential is used in calculus to refer to an infinitesimal (infinitely small) change in some varying quantity. For example, if \, x \, is a variable, then a change in the value of \, x \, is often denoted \, \Delta x \, (pronounced delta \, x \,). The differential \, d x \, represents an infinitely small change in the variable \, x \, . The idea of an infinitely small or infinitely slow change is, intuitively, extremely useful, and there are a number of ways to make the notion mathematically precise.

/////// End of quote from Wikipedia.

Rules for computing with differentials:

\, d(f+g) = df + dg \,
\, d(\alpha f) = \alpha \, df \, , \quad \alpha \in \mathbb{R}
\, d(f g) = f dg + g df \,
\, d(f(g)) = f'(g) dg \,
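
These rules can be checked symbolically. The following is a minimal SymPy sketch (the use of SymPy and the concrete functions in the chain-rule check are illustrative assumptions, not part of the original text) that verifies each rule in one variable by identifying \, df \, with \, f'(x) \, dx \, :

import sympy as sp

x, alpha = sp.symbols('x alpha')
f = sp.Function('f')(x)
g = sp.Function('g')(x)

# d(f + g) = df + dg, i.e. (f + g)' = f' + g'
assert sp.simplify(sp.diff(f + g, x) - (sp.diff(f, x) + sp.diff(g, x))) == 0

# d(alpha f) = alpha df, for a constant alpha
assert sp.simplify(sp.diff(alpha * f, x) - alpha * sp.diff(f, x)) == 0

# d(f g) = f dg + g df  (the Leibniz product rule)
assert sp.simplify(sp.diff(f * g, x) - (f * sp.diff(g, x) + g * sp.diff(f, x))) == 0

# d(f(g)) = f'(g) dg  (chain rule), checked on the (assumed) concrete
# choice f(u) = sin(u), g(x) = x**2, so that d(sin(x^2)) = cos(x^2) 2x dx
u = sp.symbols('u')
lhs = sp.diff(sp.sin(u).subs(u, x**2), x)
rhs = sp.diff(sp.sin(u), u).subs(u, x**2) * sp.diff(x**2, x)
assert sp.simplify(lhs - rhs) == 0
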

Example:

Let \, f = f(x, y) \, . Then we have

\, d(df) \, = \, d(\dfrac{\partial f}{\partial x} dx + \dfrac{\partial f}{\partial y} dy) \, = \, d( \dfrac{\partial f}{\partial x} ) dx + d( \dfrac{\partial f}{\partial y} ) dy \, = \,

\;\;\;\;\;\;\;\;\;\;\; = \, (\dfrac{{\partial}^2 f}{\partial x^2} dx + \dfrac{{\partial}^2 f}{\partial x \partial y} dy) dx + (\dfrac{{\partial}^2 f}{\partial y \partial x} dx + \dfrac{{\partial}^2 f}{\partial y^2} dy )dy \, = \,

\;\;\;\;\;\;\;\;\;\;\; = \, \dfrac{{\partial}^2 f}{\partial x^2} (dx)^2 + 2 \dfrac{{\partial}^2 f}{\partial x \partial y} dx dy + \dfrac{{\partial}^2 f}{\partial y^2} (dy)^2 \, = \,

\;\;\;\;\;\;\;\;\;\;\; = { \left( dx \dfrac{\partial}{\partial x} + dy \dfrac{\partial}{\partial y} \right) }^2 f .

NOTE: In this computation we have treated \, dx \, and \, dy \, as constants, i.e., \, d(dx) = d(dy) = 0 \, , which is valid since \, x \, and \, y \, are independent variables. We have also made use of the fact that the mixed second-order partial derivatives are equal whenever they are continuous (Schwarz's theorem), that is

\, \dfrac{{\partial}^2 f}{\partial y \partial x} \, = \, \dfrac{{\partial}^2 f}{\partial x \partial y} .
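
The expansion of \, d(df) \, can be verified for a concrete function. The following SymPy sketch (the choice \, f(x, y) = x^3 y + \sin y \, is an arbitrary illustration introduced here) treats \, dx \, and \, dy \, as constant symbols and checks that applying \, d \, twice reproduces the quadratic form above:

import sympy as sp

x, y, dx, dy = sp.symbols('x y dx dy')
f = x**3 * y + sp.sin(y)

# First differential: df = f_x dx + f_y dy
df = sp.diff(f, x) * dx + sp.diff(f, y) * dy

# Second differential: apply d again, treating dx and dy as constants
d2f = sp.diff(df, x) * dx + sp.diff(df, y) * dy

# The claimed closed form: (dx d/dx + dy d/dy)^2 f
quadratic_form = (sp.diff(f, x, 2) * dx**2
                  + 2 * sp.diff(f, x, y) * dx * dy
                  + sp.diff(f, y, 2) * dy**2)

assert sp.expand(d2f - quadratic_form) == 0
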

Connections with linear algebra:

If the second partial derivatives of \, f \, are continuous, the Hessian matrix of \, f \, is symmetric. By the spectral theorem, a real symmetric matrix has only real eigenvalues, and eigenvectors corresponding to different eigenvalues are orthogonal to each other. Moreover, any real symmetric matrix is semi-simple.
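
As a small numerical illustration (the example function \, f(x, y) = x^2 y + y^3 \, and the evaluation point are assumptions made here), the sketch below builds the Hessian with SymPy, confirms its symmetry, and uses numpy.linalg.eigh to obtain real eigenvalues and orthonormal eigenvectors:

import numpy as np
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + y**3

# Hessian built from the second partials; symmetric since the
# mixed partials agree for this smooth f
H = sp.Matrix([[sp.diff(f, a, b) for b in (x, y)] for a in (x, y)])
assert H == H.T

# Evaluate at an (arbitrarily chosen) point and diagonalize numerically;
# eigh is designed for symmetric matrices and returns real eigenvalues
H_num = np.array(H.subs({x: 1.0, y: 2.0}), dtype=float)
eigvals, eigvecs = np.linalg.eigh(H_num)
assert np.allclose(eigvecs.T @ eigvecs, np.eye(2))  # orthonormal eigenvectors
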

/////// Quoting Wikipedia (on semi-simplicity):

If one considers vector spaces over a field, such as the real numbers, the simple vector spaces are those that contain no nonzero proper subspaces. Therefore, the one-dimensional vector spaces are the simple ones. So it is a basic result of linear algebra that any finite-dimensional vector space is the direct sum of simple vector spaces; in other words, all finite-dimensional vector spaces are semi-simple.

A square matrix (in other words, a linear operator T : V → V, with V a finite-dimensional vector space) is said to be simple if its only invariant subspaces under T are {0} and V. If the field is algebraically closed (such as the complex numbers), then the only simple matrices are of size 1 by 1. A semi-simple matrix is one that is similar to a direct sum of simple matrices; if the field is algebraically closed, this is the same as being diagonalizable.

/////// End of quote from Wikipedia.
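
Over an algebraically closed field, semi-simplicity of a matrix thus amounts to diagonalizability. As an illustrative sketch (the two matrices are chosen here for illustration), SymPy's is_diagonalizable can be used to contrast a semi-simple matrix with a Jordan block, which is not semi-simple:

import sympy as sp

# A real symmetric matrix: semi-simple (diagonalizable)
symmetric = sp.Matrix([[2, 1], [1, 2]])
assert symmetric.is_diagonalizable()

# A 2x2 Jordan block: a single eigenvalue with only one eigenvector,
# hence not similar to a direct sum of 1x1 blocks (not semi-simple)
jordan = sp.Matrix([[1, 1], [0, 1]])
assert not jordan.is_diagonalizable()
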
