

Matrix Math


Singular Value Decomposition

Singular value decomposition factors an $ M \times N$ matrix (with $ M > N$) $ \varmathbb{A}$ into the product of an $ M \times N$ column-orthogonal matrix $ \varmathbb{U}$, an $ N
\times N$ diagonal matrix of singular values $ \varmathbb{S}$, and the transpose of an $ N
\times N$ orthogonal square matrix $ \varmathbb{V}$, thus

$\displaystyle \varmathbb{A} = \varmathbb{U} \varmathbb{S} \varmathbb{V}^T
$

Some features of this decomposition include:

- The diagonal entries of $ \varmathbb{S}$, the singular values, are non-negative and are conventionally sorted in decreasing order.
- The rank of $ \varmathbb{A}$ equals the number of non-zero singular values.
- The ratio of the largest to the smallest singular value is the condition number of $ \varmathbb{A}$, a measure of how close the matrix is to singular.
- The columns of $ \varmathbb{U}$ and $ \varmathbb{V}$ are orthonormal, so $ \varmathbb{U}^T \varmathbb{U} = \varmathbb{V}^T \varmathbb{V} = \varmathbb{I}$.
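As a concrete illustration, for the $ N = 2$ case the decomposition can be computed with a single one-sided Jacobi rotation. This is a minimal pure-Python sketch; the function name `svd_2col`, the restriction to two columns, and the assumption that both singular values are non-zero are illustrative choices, not part of the text above.

```python
import math

def svd_2col(A):
    """SVD of an M x 2 matrix A (list of rows) via one one-sided Jacobi
    rotation, giving A = U * diag(S) * V^T with orthonormal columns in
    U and V. Assumes both singular values are non-zero (illustrative)."""
    u = [row[0] for row in A]                  # first column of A
    v = [row[1] for row in A]                  # second column of A
    p = sum(ui * vi for ui, vi in zip(u, v))   # u . v
    q = sum(ui * ui for ui in u)               # ||u||^2
    r = sum(vi * vi for vi in v)               # ||v||^2
    # Rotation angle chosen so the rotated columns become orthogonal
    theta = 0.5 * math.atan2(2.0 * p, q - r)
    c, s = math.cos(theta), math.sin(theta)
    # Columns of A V, with V = [[c, -s], [s, c]]
    x = [c * ui + s * vi for ui, vi in zip(u, v)]
    y = [-s * ui + c * vi for ui, vi in zip(u, v)]
    sx = math.sqrt(sum(t * t for t in x))      # singular values are the
    sy = math.sqrt(sum(t * t for t in y))      # norms of the rotated columns
    U = [[xi / sx, yi / sy] for xi, yi in zip(x, y)]
    return U, [sx, sy], [[c, -s], [s, c]]
```

Reassembling $ \varmathbb{U} \varmathbb{S} \varmathbb{V}^T$ recovers $ \varmathbb{A}$ to machine precision; a production code would sweep such rotations over all column pairs for $ N > 2$.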


Cholesky Factorization

Given a matrix $ \varmathbb{A}$ which is symmetric and positive definite we can perform an LU decomposition to give

$\displaystyle \varmathbb{A} = \varmathbb{L} \varmathbb{U}$

However, by arranging the factorization such that $ \varmathbb{U} = \varmathbb{L}^T$ we get $ \varmathbb{A} = \varmathbb{L} \varmathbb{L}^T$, which is the Cholesky factorization of $ \varmathbb{A}$. The elements of $ \varmathbb{L}$ can easily be determined by equating components. The utility of a Cholesky factorization is that it converts the linear system $ \varmathbb{A}
\varmathbb{X} = \varmathbb{B}$ into two triangular systems which can be solved by forward and backward substitution. For the 2D case

$\displaystyle \left( \begin{array}{cc}
a_{11} & a_{12} \\
a_{21} & a_{22} \end{array} \right) =
\left( \begin{array}{cc}
l_{11} & 0 \\
l_{21} & l_{22} \end{array} \right)
\left( \begin{array}{cc}
l_{11} & l_{21} \\
0 & l_{22} \end{array} \right)
$

This gives us
$\displaystyle l_{11}$ $\displaystyle =$ $\displaystyle \sqrt{a_{11}}$ (1)
$\displaystyle l_{21}$ $\displaystyle =$ $\displaystyle \frac{a_{21}}{\sqrt{a_{11}}}$ (2)
$\displaystyle l_{22}$ $\displaystyle =$ $\displaystyle \sqrt{a_{22} - \frac{a^{2}_{21}}{a_{11}}}$ (3)
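The component equations above, together with the forward and backward substitutions, can be sketched directly for the 2D case. The function names are illustrative; a real implementation would handle general $ n$ and check positive definiteness.

```python
import math

def cholesky_2x2(A):
    """Cholesky factor of a 2x2 symmetric positive-definite matrix,
    using equations (1)-(3): returns L with A = L L^T."""
    a11, a21, a22 = A[0][0], A[1][0], A[1][1]
    l11 = math.sqrt(a11)                       # (1)
    l21 = a21 / l11                            # (2)
    l22 = math.sqrt(a22 - a21 * a21 / a11)     # (3)
    return [[l11, 0.0], [l21, l22]]

def solve_via_cholesky(A, b):
    """Solve A x = b: forward substitution on L y = b, then
    backward substitution on L^T x = y."""
    L = cholesky_2x2(A)
    y0 = b[0] / L[0][0]
    y1 = (b[1] - L[1][0] * y0) / L[1][1]
    x1 = y1 / L[1][1]                          # L^T is upper triangular
    x0 = (y0 - L[1][0] * x1) / L[0][0]
    return [x0, x1]
```

For example, with $ \varmathbb{A} = \left( \begin{smallmatrix} 4 & 2 \\ 2 & 3 \end{smallmatrix} \right)$ the factor is $ l_{11} = 2$, $ l_{21} = 1$, $ l_{22} = \sqrt{2}$.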

Projection of Vectors onto a Subspace

Let $ \mathbf{b}$ be a vector in $ \varmathbb{R}^m$ and let $ W$ be a subspace of $ \varmathbb{R}^m$ spanned by the vectors $ \mathbf{w}_1, \mathbf{w}_2, \ldots,
\mathbf{w}_n$. We want to find the orthogonal projection of $ \mathbf{b}$ onto $ W$. Form the matrix $ \varmathbb{A}$ whose columns are the vectors $ \mathbf{w}_1, \mathbf{w}_2, \ldots,
\mathbf{w}_n$ and then solve the normal system

$\displaystyle \varmathbb{A}^t \varmathbb{A} \mathbf{x} = \varmathbb{A}^t \mathbf{b}
$

The solution vector $ \mathbf{x}$ when premultiplied by $ \varmathbb{A}$ gives the orthogonal projection of $ \mathbf{b}$ onto $ W$, i.e.,

$\displaystyle \mathrm{Proj}_W(\mathbf{b}) = \varmathbb{A} \mathbf{x}
$
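A minimal sketch of this recipe for two spanning vectors in $ \varmathbb{R}^3$, solving the $ 2 \times 2$ normal system by Cramer's rule. The helper name and the restriction to $ n = 2$ are illustrative assumptions.

```python
def project_onto_span(w1, w2, b):
    """Orthogonal projection of b onto W = span{w1, w2}: solve the
    normal system A^T A x = A^T b for A = [w1 w2], then return A x."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    # Entries of A^T A (the Gram matrix) and of A^T b
    g11, g12, g22 = dot(w1, w1), dot(w1, w2), dot(w2, w2)
    c1, c2 = dot(b, w1), dot(b, w2)
    # Cramer's rule for the 2x2 system
    det = g11 * g22 - g12 * g12
    x1 = (c1 * g22 - c2 * g12) / det
    x2 = (g11 * c2 - g12 * c1) / det
    # Proj_W(b) = A x = x1 w1 + x2 w2
    return [x1 * u + x2 * v for u, v in zip(w1, w2)]
```

As a check, the residual $ \mathbf{b} - \mathrm{Proj}_W(\mathbf{b})$ is orthogonal to both spanning vectors.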

An alternative representation (from MathWorld) is that if the subspace $ W$ has an orthonormal basis $ \{\mathbf{w}_1,
\mathbf{w}_2, \ldots, \mathbf{w}_n\}$ then the orthogonal projection onto $ W$ is given by

$\displaystyle \mathrm{Proj}_W(\mathbf{b}) = \sum_{i=1}^{n} \langle \mathbf{b},
\mathbf{w}_i \rangle \mathbf{w}_i
$

Here $ \langle \cdot, \cdot \rangle$ denotes the inner product, the generalization of the dot product.

Yet another definition says that if $ \{\mathbf{w}_1,
\mathbf{w}_2, \ldots, \mathbf{w}_n\}$ is an orthonormal basis for a subspace $ W$ and $ \varmathbb{A}$ is the matrix whose columns are these basis vectors, then

$\displaystyle \mathrm{Proj}_W(\mathbf{b}) = \varmathbb{A} \varmathbb{A}^t \mathbf{b}
$

Note that this form is valid only when the columns of $ \varmathbb{A}$ are orthonormal; as before, $ \mathbf{b} \in \varmathbb{R}^m$, the space containing $ W$.
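The orthonormal-basis formula $ \mathrm{Proj}_W(\mathbf{b}) = \sum_i \langle \mathbf{b}, \mathbf{w}_i \rangle \mathbf{w}_i$ translates directly to code. This sketch assumes the basis vectors supplied are already orthonormal; the function name is illustrative.

```python
def proj_orthonormal(basis, b):
    """Projection of b onto W via Proj_W(b) = sum_i <b, w_i> w_i
    (equivalently A A^T b). Assumes basis is orthonormal."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    proj = [0.0] * len(b)
    for w in basis:
        coeff = dot(b, w)          # <b, w_i>
        proj = [p + coeff * wi for p, wi in zip(proj, w)]
    return proj
```

For an orthonormal pair such as $ \mathbf{w}_1 = \tfrac{1}{\sqrt{2}}(1, 1, 0)$ and $ \mathbf{w}_2 = (0, 0, 1)$, projecting $ \mathbf{b} = (1, 2, 3)$ gives $ (\tfrac{3}{2}, \tfrac{3}{2}, 3)$.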


August 12, 2004