## First-order ODEs, matrix exponentials, and det(exp)

Last time we derived a formula for the derivative of the matrix exponential. Here we focus instead on the expression $$D\exp(x)u=\exp(x)u=u\exp(x),$$ which holds whenever $$u$$ commutes with $$x$$. In this post, $$E$$ denotes a real Banach space and $$L(E)$$ denotes the space of continuous linear operators on $$E$$.
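Both identities in the title and in the formula above are easy to sanity-check numerically. Here is a minimal sketch using SciPy's `scipy.linalg.expm` for the matrix exponential; the test matrix, the commuting direction $$u = x^2 + x$$, and the finite-difference step are my own arbitrary choices.

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(0)
x = 0.5 * rng.standard_normal((3, 3))

# det(exp(x)) = exp(tr(x))
det_ok = np.isclose(np.linalg.det(expm(x)), np.exp(np.trace(x)))

# If u commutes with x (here u = x^2 + x does), then D exp(x)u = exp(x)u.
u = x @ x + x
h = 1e-5
fd = (expm(x + h * u) - expm(x - h * u)) / (2 * h)  # central-difference approximation of D exp(x)u
deriv_ok = np.allclose(fd, expm(x) @ u, atol=1e-4)

print(det_ok, deriv_ok)
```

Note that the second check would fail for a generic, non-commuting direction $$u$$; in that case one needs the integral formula discussed below.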

## ODEs

The (matrix) exponential can be used to solve certain first-order linear systems of ordinary differential equations, including some with non-constant coefficients: not only can we solve \begin{align}x'(t)&=x(t)+y(t) \\ y'(t)&=x(t)+y(t),\end{align} but we can also solve \begin{align}x'(t)&=tx(t)+y(t) \\ y'(t)&=x(t)+ty(t).\end{align}
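The second system has coefficient matrix $$A(t)=\begin{pmatrix}t&1\\1&t\end{pmatrix}$$, and since $$A(t)$$ commutes with $$A(s)$$ for all $$s,t$$, the solution is $$x(t)=\exp\left(\int_0^t A(s)\,ds\right)x(0)$$. A sketch of a numerical cross-check (the initial condition and time horizon are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

def A(t):
    # coefficient matrix of the system x' = t x + y, y' = x + t y
    return np.array([[t, 1.0], [1.0, t]])

x0 = np.array([1.0, 2.0])  # arbitrary initial condition
T = 1.0

# A(t) commutes with A(s) for all s, t (each is t*I plus the same fixed matrix),
# so the solution is x(T) = exp(integral of A from 0 to T) applied to x(0).
B = np.array([[T**2 / 2, T], [T, T**2 / 2]])  # = ∫_0^T A(s) ds
x_exp = expm(B) @ x0

# Cross-check against a general-purpose ODE solver.
sol = solve_ivp(lambda t, x: A(t) @ x, (0.0, T), x0, rtol=1e-10, atol=1e-12)
x_num = sol.y[:, -1]
match = np.allclose(x_exp, x_num, atol=1e-6)
print(x_exp, match)
```

The commutation hypothesis is essential here: for a general $$A(t)$$ that does not commute with its integral, the exponential of the integral is not the solution.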

## Fréchet derivative of the (matrix) exponential function

$$D\exp(x)u = \int_0^1 e^{sx}ue^{(1-s)x}\,ds.$$ This intriguing formula expresses the derivative of the exponential map on a Banach algebra as an integral. In particular, using “matrix calculus” notation we have the formula $$d\exp(X)= \int_0^1 e^{sX}(dX)e^{(1-s)X}\,ds$$ when $$X$$ is a square matrix. As we’ll see, this is not too hard to prove.
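The formula can also be checked numerically for a direction that does not commute with $$X$$: approximate the integral by quadrature and compare with a finite-difference derivative of `expm`. A minimal sketch (the matrices, quadrature order, and step size are my own choices):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
X = 0.5 * rng.standard_normal((3, 3))
U = 0.5 * rng.standard_normal((3, 3))  # a direction that need not commute with X

# Right-hand side: integral of e^{sX} U e^{(1-s)X} over [0, 1], via Gauss-Legendre quadrature.
nodes, weights = np.polynomial.legendre.leggauss(20)
s, w = 0.5 * (nodes + 1.0), 0.5 * weights  # map nodes/weights from [-1, 1] to [0, 1]
integral = sum(wi * expm(si * X) @ U @ expm((1 - si) * X) for si, wi in zip(s, w))

# Left-hand side: D exp(X)U, approximated by a central difference.
h = 1e-5
fd = (expm(X + h * U) - expm(X - h * U)) / (2 * h)
formula_ok = np.allclose(integral, fd, atol=1e-6)
print(formula_ok)
```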

## Convex functions, second derivatives and Hessian matrices

In single variable calculus, a twice differentiable function $$f:(a,b)\to\mathbb{R}$$ is convex if and only if $$f^{\prime\prime}(x)\ge 0$$ for all $$x\in(a,b)$$. It is not too hard to extend this result to functions defined on more general spaces:
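In several variables the criterion becomes: a twice differentiable function is convex if and only if its second derivative (Hessian) is positive semidefinite at every point. As a sketch, here is a finite-difference check of positive semidefiniteness for one standard convex function, log-sum-exp; the function, test points, and step size are my own choices, not taken from the post.

```python
import numpy as np

def f(x):
    return np.log(np.sum(np.exp(x)))  # log-sum-exp, a standard convex function

def num_hessian(f, x, h=1e-4):
    # central second differences: H[i, j] approximates d^2 f / dx_i dx_j
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

rng = np.random.default_rng(2)
psd_ok = True
for _ in range(5):
    x = rng.standard_normal(4)
    H = num_hessian(f, x)
    eigs = np.linalg.eigvalsh(0.5 * (H + H.T))  # symmetrize before the eigendecomposition
    psd_ok &= bool(eigs.min() > -1e-6)          # positive semidefinite up to numerical error
print(psd_ok)
```

The smallest eigenvalue here is in fact exactly zero (log-sum-exp is constant along the diagonal direction up to an additive constant), so the tolerance only absorbs finite-difference error.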

## Differentiation done correctly: 5. Maxima and minima

Navigation: 1. The derivative | 2. Higher derivatives | 3. Partial derivatives | 4. Inverse and implicit functions | 5. Maxima and minima

In this final post, we are going to look at some applications of differentiation to locating maxima and minima of real valued functions. In order to do this, we will be using Taylor’s theorem (covered in part 2) to prove the higher derivative test for functions on Banach spaces, and the implicit function theorem (covered in part 4) to prove a special case of the method of Lagrange multipliers.

## Differentiation done correctly: 4. Inverse and implicit functions

Now we’re going to prove the inverse function and implicit function theorems for Banach spaces.

## Differentiation done correctly: 3. Partial derivatives

While we saw that differentiable maps may be naturally split into component functions when the codomain is a product of Banach spaces, the situation for the domain is more complicated. (This is partly because a product of Banach spaces has natural projections onto its factors, but no natural base-point-free injections from them.) In this post, we will look at how the existence of partial derivatives relates to differentiability, how the symmetry of higher derivatives (covered in part 2) affects mixed partial derivatives, and finally a short proof of differentiation under the integral sign.

## Differentiation done correctly: 2. Higher derivatives

Last time, we covered the definition of the derivative and its basic properties, which all turn out to be quite similar to their single variable counterparts. Now we are going to explore higher derivatives. In traditional multivariable calculus, true higher derivatives do not exist (except in a specific situation which will be discussed in part 5). Of course, we have so-called “mixed/higher partial derivatives”, which are coordinate-dependent and notationally tricky to work with. As a consequence, the usual statement of Taylor’s theorem in $$\mathbb{R}^n$$ ends up being ugly and hard to remember. In reality, Taylor’s theorem for Banach spaces looks almost exactly the same as the single variable Taylor’s theorem!

## Differentiation done correctly: 1. The derivative

In multivariable calculus, one often encounters nonsensical equations such as the following:

## Coursera first impressions

I recently signed up for a few Coursera courses, and overall it’s been a fun experience. Here are some of my comments.

## Game Theory

So far the instructors have explained normal form games, Nash equilibrium, Pareto optimality, mixed strategies and maxmin strategies. The definitions are pretty clear and plenty of examples follow. The graded problem sets are usually fairly easy and sometimes include interesting examples not found in the videos. However, there are no proofs of any theorems or results. In fact, the theorems themselves are not even written down in any precise way – the instructors seem to be intent on avoiding mathematical notation here for some reason. Verdict: Enrol.

## The Modern World: Global History since 1760

Starting near the end of the Commercial Revolution, the instructor explores global history all the way up to the present. There are usually one or two questions at the end of each video to determine whether you have been paying attention. The instructor (Philip Zelikow from the University of Virginia) really shows his enthusiasm for the subject in the videos. Verdict: Enrol!

## Image and video processing: From Mars to Hollywood with a stop at the hospital

The instructor begins by giving many applications of image processing, and then gets straight into topics like Huffman coding and the discrete cosine transform (no proofs). I haven’t watched a lot of them yet. Verdict: ???

## Introduction to Finance

The instructor is clearly excited about the subject, but keeps going off on tangents. Furthermore, he takes around an hour to explain the compound interest formula $$P(1+r/n)^{nt}$$, something that should take 5 minutes at most. Use Khan Academy instead. Verdict: Avoid.

## Analytic Combinatorics, Part I

Finally, a course in which the forums aren’t full of complaints about “too much math”. Sedgewick’s voice can be a little boring at times, but the content should be extremely interesting to anyone who knows about generating functions. Verdict: Enrol!