Projections of the Mahalanobis norm

In this post \langle \cdot,\cdot \rangle is the standard inner product on \mathbb{R}^n for some n \geq 2 and A is a fixed symmetric positive definite operator. The squared Mahalanobis norm m of a vector v \in \mathbb{R}^n for this operator is defined by

\displaystyle m(v) = \langle v, A^{-1}v \rangle.

The results in this post were found while looking for ways to approximate this Mahalanobis norm without the need to invert A. (Later I realised that using the Cholesky factorisation suited me better. Nice results can be found by looking in the wrong places!) The idea is to use projections onto some lower-dimensional subspace to get estimates of the actual Mahalanobis norm. To be precise, let V \subseteq \mathbb{R}^n be a subspace of dimension 1 \leq d \leq n and let \pi_V be the orthogonal projection onto V. The operator \pi_V A is non-singular on the subspace V. Let A_V^{-1}:\mathbb{R}^n\to V be its pseudoinverse, characterised by A_V^{-1}A\pi_V = \pi_V A A_V^{-1} = \pi_V. The projected Mahalanobis norm m_V on V is defined by

\displaystyle m_V(v) = \langle A_V^{-1} v, v \rangle.
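For concreteness, here is one way to compute m_V numerically (my own sketch, not part of the post): if Q is an n \times d matrix whose columns form an orthonormal basis of V, then one can check that Q(Q^TAQ)^{-1}Q^T satisfies both defining relations of A_V^{-1}.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3

# A hypothetical symmetric positive definite A (B^T B + I for conditioning).
B = rng.standard_normal((n, n))
A = B.T @ B + np.eye(n)

# Orthonormal basis Q of a random d-dimensional subspace V.
Q, _ = np.linalg.qr(rng.standard_normal((n, d)))
P = Q @ Q.T                                   # orthogonal projection pi_V

# Candidate pseudoinverse A_V^{-1} = Q (Q^T A Q)^{-1} Q^T.
AVinv = Q @ np.linalg.solve(Q.T @ A @ Q, Q.T)

# Check the defining relations A_V^{-1} A pi_V = pi_V A A_V^{-1} = pi_V.
assert np.allclose(AVinv @ A @ P, P)
assert np.allclose(P @ A @ AVinv, P)

# Projected Mahalanobis norm of a vector.
v = rng.standard_normal(n)
m_V = v @ (AVinv @ v)
```

Note that only the small d \times d system Q^TAQ is solved; A itself is never inverted.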

Let’s take the one-dimensional case as an example. Let v \in \mathbb{R}^n be non-zero and denote the span of v by \llbracket v \rrbracket. Then the norm m_{\llbracket v \rrbracket} is given by

\displaystyle m_{\llbracket v \rrbracket}(v) = \frac{\langle v,v \rangle^2}{\langle A v, v \rangle}.
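A quick numerical illustration of the one-dimensional case (my own sketch in NumPy, not from the original derivation) — it also exhibits the lower-bound property stated next:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# A hypothetical symmetric positive definite A.
B = rng.standard_normal((n, n))
A = B.T @ B + np.eye(n)
v = rng.standard_normal(n)

m_exact = v @ np.linalg.solve(A, v)       # m(v) = <v, A^{-1} v>
m_1d = (v @ v) ** 2 / (v @ (A @ v))       # projection onto span{v}; no inverse of A

assert m_1d <= m_exact + 1e-12
```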

Note that this expression does not involve the inverse of A. The basic property of the projected Mahalanobis norm is the following:

The inequality m_V \leq m holds throughout V. Equality m_V(v) = m(v) occurs if and only if A_V^{-1}v = A^{-1}v.

This property follows from the Cauchy-Schwarz inequality for the inner product \langle \cdot, A^{-1} \cdot \rangle:

\displaystyle \langle A_V^{-1}v, v\rangle^2 = \langle A A_V^{-1}v, A^{-1} v\rangle^2 \leq \langle A A_V^{-1}v, A_V^{-1} v \rangle \langle v, A^{-1}v\rangle = \langle v, A_V^{-1} v \rangle \langle v, A^{-1}v\rangle.

This is an equality if and only if AA_V^{-1}v and v are linearly dependent. Combined with \pi_V A A_V^{-1}v = \pi_V v = v for v \in V, it follows that in fact AA_V^{-1}v = v, which is equivalent to A_V^{-1}v = A^{-1}v.

The following realisation came as a surprise. It shows that projections onto two-dimensional subspaces suffice to get an exact value for the Mahalanobis norm:

Let v\in \mathbb{R}^n be a non-zero vector and let V \subseteq \mathbb{R}^n be the span of \{v, A^{-1}v \} (so \dim(V) \leq 2). Then m_V(v) = m(v). Indeed, A^{-1}v lies in V and satisfies \pi_V A (A^{-1}v) = \pi_V v = v, so A_V^{-1}v = A^{-1}v and equality holds by the criterion above.
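This exactness is easy to confirm numerically. The sketch below (my own, assuming the representation A_V^{-1} = Q(Q^TAQ)^{-1}Q^T for an orthonormal basis Q of V, which satisfies the two defining relations) computes m_V(v) for V = \mathrm{span}\{v, A^{-1}v\} and compares it with m(v):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

# A hypothetical symmetric positive definite A.
B = rng.standard_normal((n, n))
A = B.T @ B + np.eye(n)
v = rng.standard_normal(n)

# Orthonormal basis Q of V = span{v, A^{-1} v}.
Ainv_v = np.linalg.solve(A, v)
Q, _ = np.linalg.qr(np.column_stack([v, Ainv_v]))

# m_V(v) = <A_V^{-1} v, v> with A_V^{-1} = Q (Q^T A Q)^{-1} Q^T.
M = Q.T @ A @ Q
m_V = (Q.T @ v) @ np.linalg.solve(M, Q.T @ v)
m = v @ Ainv_v                           # exact Mahalanobis norm m(v)

assert abs(m_V - m) < 1e-8 * max(1.0, m)
```

(Of course this particular check inverts A to build the subspace; the point of the statement is theoretical.)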

The projected norm for a two-dimensional subspace also has a simple explicit form. Let w \in \mathbb{R}^n be a non-zero vector orthogonal to v and let V be the span of \{v, w\}. The norm m_V is given by

\displaystyle m_V(v) = \frac{\langle v, v \rangle^2}{\langle A v, v\rangle - \frac{\langle A v, w \rangle^2}{\langle A w, w\rangle}}.
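The explicit two-dimensional formula can be checked against the general projected norm computed from an orthonormal basis (again my own NumPy sketch, using the same Q(Q^TAQ)^{-1}Q^T representation of the pseudoinverse as an assumption):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6

# A hypothetical symmetric positive definite A.
B = rng.standard_normal((n, n))
A = B.T @ B + np.eye(n)
v = rng.standard_normal(n)

# A non-zero w orthogonal to v (Gram-Schmidt step).
w = rng.standard_normal(n)
w -= (w @ v) / (v @ v) * v

# Explicit formula for m_V(v) on V = span{v, w}; uses A but not A^{-1}.
m_explicit = (v @ v) ** 2 / (v @ (A @ v) - (v @ (A @ w)) ** 2 / (w @ (A @ w)))

# General projected norm via an orthonormal basis of the same subspace.
Q, _ = np.linalg.qr(np.column_stack([v, w]))
M = Q.T @ A @ Q
m_V = (Q.T @ v) @ np.linalg.solve(M, Q.T @ v)

assert abs(m_explicit - m_V) < 1e-8 * max(1.0, m_V)
```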
