Matrix Decomposition Methods
QR Decomposition
QR Decomposition, or QR Factorization, expresses a matrix A as the product: <math>\left [ \bold{A} \right ] = \left [ \bold{Q} \right ] \cdot \left [ \bold{R} \right ]</math>
Where:
- Q is an orthogonal matrix
- R is an upper triangular matrix
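As a concrete illustration (not part of the definition above), a minimal sketch of computing a QR decomposition numerically with NumPy's <code>numpy.linalg.qr</code>; the matrix A here is an arbitrary example:
<syntaxhighlight lang="python">
# Minimal sketch: QR decomposition with NumPy. The matrix A is an arbitrary example.
import numpy as np

A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])

Q, R = np.linalg.qr(A)                   # A = Q @ R

print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal
print(np.allclose(np.triu(R), R))        # True: R is upper triangular
print(np.allclose(Q @ R, A))             # True: the product reconstructs A
</syntaxhighlight>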
Applications
QR decomposition is often used to solve the linear least squares problem, and is the basis for a particular eigenvalue algorithm, the QR algorithm.
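For example, once <math>\bold{A} = \bold{Q}\bold{R}</math> is known, the least squares problem reduces to the small triangular system <math>\bold{R}\bold{x} = \bold{Q}^T\bold{b}</math>. A minimal sketch of this, using made-up data purely for illustration:
<syntaxhighlight lang="python">
# Sketch: solving an overdetermined least squares problem A x ≈ b via QR.
# The data is invented purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))        # 20 equations, 3 unknowns
b = rng.normal(size=20)

Q, R = np.linalg.qr(A)              # reduced QR: Q is 20x3, R is 3x3
x = np.linalg.solve(R, Q.T @ b)     # solve the small triangular system R x = Q^T b

# Agrees with the generic least squares solver
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))        # True
</syntaxhighlight>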
Singular Value Decomposition (SVD)
Singular Value Decomposition (SVD) expresses a matrix A as the product:
<math>\left [ \bold{A} \right ] = \left [ \bold{U} \right ] \cdot \left [ \bold{\Sigma} \right ] \cdot \left [ \bold{V} \right ]^T</math>
Where:
- A is an <math>m \times n</math> real or complex matrix
- U is an <math>m \times m</math> real or complex unitary matrix
- <math>\bold{\Sigma}</math> is an <math>m \times n</math> rectangular diagonal matrix with non-negative real numbers on the diagonal
- V is an <math>n \times n</math> real or complex unitary matrix.
The diagonal entries <math>\sigma_i</math> of <math>\bold{\Sigma}</math> are known as the singular values of A. The columns of U and the columns of V are called the left-singular vectors and right-singular vectors of A, respectively.
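A minimal numerical sketch of the factorization, using NumPy's <code>numpy.linalg.svd</code> on an arbitrary example matrix:
<syntaxhighlight lang="python">
# Sketch: computing the SVD with NumPy. The matrix A is an arbitrary example.
import numpy as np

A = np.array([[1.0, 0.0, 0.0, 0.0, 2.0],
              [0.0, 0.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 0.0],
              [0.0, 2.0, 0.0, 0.0, 0.0]])   # m = 4, n = 5

U, s, Vt = np.linalg.svd(A)    # full SVD: U is 4x4, Vt is 5x5, s holds the singular values

Sigma = np.zeros_like(A)       # rebuild the 4x5 rectangular diagonal matrix
np.fill_diagonal(Sigma, s)

print(s)                                # singular values, in descending order
print(np.allclose(U @ Sigma @ Vt, A))   # True: A = U Σ V^T
</syntaxhighlight>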
The singular value decomposition can be computed using the following observations:
- The left-singular vectors of A are a set of orthonormal eigenvectors of <math>\bold{A}\bold{A}^T</math>.
- The right-singular vectors of A are a set of orthonormal eigenvectors of <math>\bold{A}^T\bold{A}</math>.
- The non-zero singular values of A (found on the diagonal of Σ) are the square roots of the non-zero eigenvalues of both <math>\bold{A}^T\bold{A}</math> and <math>\bold{A}\bold{A}^T</math>.
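These observations can be checked numerically; the sketch below uses an arbitrary example matrix and compares the squared singular values with the eigenvalues of <math>\bold{A}^T\bold{A}</math> and <math>\bold{A}\bold{A}^T</math>:
<syntaxhighlight lang="python">
# Sketch: the non-zero singular values of A are the square roots of the non-zero
# eigenvalues of A^T A (and of A A^T). The matrix is an arbitrary example.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [1.0, 1.0]])

s = np.linalg.svd(A, compute_uv=False)   # singular values of A

eig_AtA = np.linalg.eigvalsh(A.T @ A)    # eigenvalues of A^T A (ascending order)
eig_AAt = np.linalg.eigvalsh(A @ A.T)    # eigenvalues of A A^T (one of them is zero)

print(np.allclose(np.sort(s**2), eig_AtA))      # True
print(np.allclose(np.sort(s**2), eig_AAt[-2:])) # True: matches the two non-zero eigenvalues
</syntaxhighlight>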
Applications
Applications that employ the SVD include computing the pseudoinverse, least squares fitting of data, multivariable control, matrix approximation, and determining the rank, range and null space of a matrix.
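As one illustration, the Moore–Penrose pseudoinverse <math>\bold{A}^+ = \bold{V}\bold{\Sigma}^+\bold{U}^T</math> can be assembled directly from the SVD by inverting only the non-zero singular values; in the sketch below the matrix and the tolerance are illustrative choices:
<syntaxhighlight lang="python">
# Sketch: Moore-Penrose pseudoinverse via the SVD, A^+ = V Σ^+ U^T, where Σ^+
# inverts the non-zero singular values. Matrix and tolerance are illustrative choices.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])            # rank-deficient 3x2 matrix (rank 1)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

tol = 1e-12
s_inv = np.zeros_like(s)
s_inv[s > tol] = 1.0 / s[s > tol]     # invert only the non-zero singular values

A_pinv = Vt.T @ np.diag(s_inv) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))   # True: matches NumPy's built-in pseudoinverse
</syntaxhighlight>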