# Horn’s inequality for singular values via exterior algebra

$$\DeclareMathOperator{\tr}{tr} \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\Id}{Id}$$Horn’s inequality states that for any two compact operators $$\sigma,\tau$$ on a Hilbert space $$E$$, $$\prod_{k=1}^n s_k(\sigma\tau) \le \prod_{k=1}^n s_k(\sigma)s_k(\tau)$$ where $$s_1(\tau),s_2(\tau),\dots$$ are the singular values of $$\tau$$ arranged in descending order. Alfred Horn’s original 1950 paper provides a short proof that relies on the following result: […]
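As a quick numerical sanity check (not part of any proof), the inequality can be verified for random matrices; the matrix size and seed below are arbitrary choices for illustration:

```python
# Check prod_{k<=n} s_k(sigma tau) <= prod_{k<=n} s_k(sigma) s_k(tau)
# for random square matrices, for every n up to the dimension.
import numpy as np

rng = np.random.default_rng(0)
sigma = rng.standard_normal((5, 5))
tau = rng.standard_normal((5, 5))

# np.linalg.svd returns singular values in descending order.
s_prod = np.linalg.svd(sigma @ tau, compute_uv=False)
s_sigma = np.linalg.svd(sigma, compute_uv=False)
s_tau = np.linalg.svd(tau, compute_uv=False)

for n in range(1, 6):
    lhs = np.prod(s_prod[:n])
    rhs = np.prod(s_sigma[:n] * s_tau[:n])
    assert lhs <= rhs + 1e-9  # small tolerance for floating-point error
```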

# Line integrals: 3. Applications to complex analysis


# Line integrals: 2. Locally exact forms and singular homology

Navigation: 1. Exact, conservative and closed forms | 2. Locally exact forms and singular homology | 3. Applications to complex analysis As in part 1, $$E,F$$ are Banach spaces and $$U\subseteq E$$ is an open set. Recall that a form $$\omega:U\to L(E,F)$$ is exact if $$\omega=Df$$ for some $$f:U\to F$$, closed if it is differentiable […]

# Line integrals: 1. Exact, conservative and closed forms

Navigation: 1. Exact, conservative and closed forms | 2. Locally exact forms and singular homology | 3. Applications to complex analysis The line integral is a useful tool for working with vector fields on $$\mathbb{R}^n$$, (co)vector fields on manifolds, and complex differentiable functions. However, it is often unclear how these different versions of the line […]

# First-order ODEs, matrix exponentials, and det(exp)

Last time we derived a formula for the derivative of the matrix exponential. Here we focus instead on the expression $$D\exp(x)u=\exp(x)u=u\exp(x),$$ which holds whenever $$u$$ commutes with $$x$$. In this post, $$E$$ denotes a real Banach space and $$L(E)$$ denotes the space of continuous linear operators on $$E$$. ODEs: The (matrix) exponential can […]
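The commuting case can be checked numerically with a finite difference; the sketch below (illustration only, with an ad hoc truncated-series `expm` helper) uses $$u = x^2$$, which always commutes with $$x$$:

```python
# Finite-difference check that D exp(x)u = exp(x)u = u exp(x) when u
# commutes with x. Here u = x^2 commutes with x by construction.
import numpy as np

def expm(a, terms=30):
    """Matrix exponential via a truncated Taylor series (fine for small norms)."""
    out = np.eye(a.shape[0])
    term = np.eye(a.shape[0])
    for k in range(1, terms):
        term = term @ a / k
        out = out + term
    return out

rng = np.random.default_rng(1)
x = 0.1 * rng.standard_normal((4, 4))
u = x @ x                                 # commutes with x
h = 1e-6
fd = (expm(x + h * u) - expm(x)) / h      # finite-difference directional derivative
assert np.allclose(fd, expm(x) @ u, atol=1e-4)
assert np.allclose(expm(x) @ u, u @ expm(x))
```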

# Fréchet derivative of the (matrix) exponential function

$$D\exp(x)u = \int_0^1 e^{sx}ue^{(1-s)x}\,ds.$$ This intriguing formula expresses the derivative of the exponential map on a Banach algebra as an integral. In particular, using “matrix calculus” notation we have the formula $$d\exp(X)= \int_0^1 e^{sX}(dX)e^{(1-s)X}\,ds$$ when $$X$$ is a square matrix. As we’ll see, this is not too hard to prove.
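The integral formula itself can also be tested numerically (an illustration, not a proof): approximate the integral with a midpoint rule and compare against a finite-difference derivative. The `expm` helper, step sizes, and matrix dimensions below are arbitrary choices.

```python
# Check D exp(x)u = \int_0^1 e^{sx} u e^{(1-s)x} ds numerically,
# comparing a midpoint-rule quadrature against a finite difference.
import numpy as np

def expm(a, terms=30):
    """Matrix exponential via a truncated Taylor series (fine for small norms)."""
    out = np.eye(a.shape[0])
    term = np.eye(a.shape[0])
    for k in range(1, terms):
        term = term @ a / k
        out = out + term
    return out

rng = np.random.default_rng(2)
x = 0.2 * rng.standard_normal((3, 3))
u = rng.standard_normal((3, 3))           # need NOT commute with x here

n = 200
nodes = (np.arange(n) + 0.5) / n          # midpoint nodes in [0, 1]
integral = sum(expm(s * x) @ u @ expm((1 - s) * x) for s in nodes) / n

h = 1e-6
fd = (expm(x + h * u) - expm(x)) / h
assert np.allclose(integral, fd, atol=1e-4)
```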

# Convex functions, second derivatives and Hessian matrices

In single variable calculus, a twice differentiable function $$f:(a,b)\to\mathbb{R}$$ is convex if and only if $$f^{\prime\prime}(x)\ge 0$$ for all $$x\in(a,b)$$. It is not too hard to extend this result to functions defined on more general spaces:
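The multivariable analogue (a twice differentiable function is convex iff its Hessian is positive semidefinite everywhere) can be illustrated concretely; the function $$f(x,y)=e^x+e^y+x^2$$ and the sample points below are examples chosen here, not from the post:

```python
# Check positive semidefiniteness of the Hessian of the convex function
# f(x, y) = exp(x) + exp(y) + x^2 at a few sample points.
import numpy as np

def hessian(x, y):
    # Exact Hessian: d2f/dx2 = e^x + 2, d2f/dy2 = e^y, mixed partials = 0.
    return np.array([[np.exp(x) + 2.0, 0.0],
                     [0.0, np.exp(y)]])

for x, y in [(-1.0, 0.5), (0.0, 0.0), (2.0, -3.0)]:
    eigvals = np.linalg.eigvalsh(hessian(x, y))
    assert np.all(eigvals >= 0)   # Hessian is positive semidefinite here
```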

# Differentiation done correctly: 5. Maxima and minima

Navigation: 1. The derivative | 2. Higher derivatives | 3. Partial derivatives | 4. Inverse and implicit functions | 5. Maxima and minima In this final post, we are going to look at some applications of differentiation to locating maxima and minima of real valued functions. In order to do this, we will be using […]