(ch:svd)=
# SVD: Singular Value Decomposition

The Singular Value Decomposition (`SVD`) solver object can be used for computing a partial SVD of a rectangular matrix, and other related problems. It provides uniform and efficient access to several specific SVD solvers included in SLEPc, and also gives the possibility to compute the decomposition via the eigensolvers provided in the `EPS` package. In many aspects, the user interface of `SVD` resembles that of `EPS`. For this reason, this chapter and chapter [](#ch:eps) have a very similar structure.

(sec:svdback)=
## Mathematical Background

This section provides some basic concepts about the singular value decomposition and other related problems. The objective is to set up the notation and also to justify some of the solution approaches, particularly those based on the `EPS` object. As in the case of eigensolvers, some of the implemented methods are described in detail in the SLEPc [technical reports](#str). For background material about the SVD, see for instance {cite:p}`Bai:2000:TSA{ch. 6}`. Many other books such as {cite:p}`Bjorck:1996:NML` or {cite:p}`Hansen:1998:RDI` present the SVD from the perspective of its application to the solution of least squares problems and other related linear algebra problems.

(sec:svd)=
### The (Standard) Singular Value Decomposition (SVD)

The singular value decomposition (SVD) of an $m\times n$ matrix $A$ can be written as

```{math}
:label: eq:svd

A=U\Sigma V^*,
```

where $U=[u_1,\ldots,u_m]$ is an $m\times m$ unitary matrix ($U^*U=I$), $V=[v_1,\ldots,v_n]$ is an $n\times n$ unitary matrix ($V^*V=I$), and $\Sigma$ is an $m\times n$ diagonal matrix with real diagonal entries $\Sigma_{ii}=\sigma_i$ for $i=1,\ldots,\min\{m,n\}$. If $A$ is real, $U$ and $V$ are real and orthogonal. The vectors $u_i$ are called the left singular vectors, the $v_i$ are the right singular vectors, and the $\sigma_i$ are the singular values.

```{figure} ../../_static/images/manual/svg/fig-svd.svg
:alt: Scheme of the thin SVD of a rectangular matrix $A$.
:name: fig:svd

Scheme of the thin SVD of a rectangular matrix $A$.
```

In the following, we will assume that $m\geq n$. If $m<n$ then the decomposition of $A^*$ can be considered instead, since its singular values coincide with those of $A$ while the roles of the left and right singular vectors are interchanged. Retaining only the first $n$ columns of $U$, together with $\Sigma_n=\mathrm{diag}(\sigma_1,\ldots,\sigma_n)$, gives the *thin* SVD, $A=U_n\Sigma_n V^*$, depicted in {numref}`fig:svd`. Each $(\sigma_i,u_i,v_i)$ is called a singular triplet, and the singular triplets satisfy

```{math}
:label: eq:svdleft

Av_i=\sigma_iu_i,
```

```{math}
:label: eq:svdright

A^*u_i=\sigma_iv_i,
```

for $i=1,\ldots,n$. The decomposition can also be written as a sum of $n$ rank-one matrices,

```{math}
:label: eq:svdouter

A=\sum_{i=1}^{n}\sigma_iu_iv_i^*.
```

The singular values are real and nonnegative, $\sigma_1\geq\sigma_2\geq\cdots\geq\sigma_r>\sigma_{r+1}=\cdots=\sigma_n=0$, where $r=\mathrm{rank}(A)$. It can be shown that $\{u_1,\ldots,u_r\}$ span the range of $A$, $\mathcal{R}(A)$, whereas $\{v_{r+1},\ldots,v_n\}$ span the null space of $A$, $\mathcal{N}(A)$.

If the zero singular values are dropped from the sum in equation {math:numref}`eq:svdouter`, the resulting factorization, $A=\sum_{i=1}^{r}\sigma_iu_iv_i^*$, is called the *compact* SVD, $A=U_r\Sigma_r V_r^*$.

In the case of a very large and sparse $A$, it is usual to compute only a subset of $k\leq r$ singular triplets. We will refer to this decomposition as the *truncated* SVD of $A$. It can be shown that the matrix $A_k=U_k\Sigma_k V_k^*$ is the best rank-$k$ approximation to matrix $A$ in the least squares sense. In general, one can take an arbitrary subset of the summands in equation {math:numref}`eq:svdouter`, and the resulting factorization is called the *partial* SVD of $A$. As described later in this chapter, SLEPc allows the computation of a partial SVD corresponding to either the $k$ largest or smallest singular triplets.
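As a preview of the solver interface described later in this chapter, the following minimal sketch shows how the $k$ largest singular triplets of an already assembled matrix might be computed with an `SVD` object. The helper name `ComputePartialSVD` is made up for illustration; error handling follows the usual PETSc `PetscCall` convention.

```c
#include <slepcsvd.h>

/* Sketch: compute the k largest singular triplets of an assembled matrix A */
PetscErrorCode ComputePartialSVD(Mat A, PetscInt k)
{
  SVD       svd;
  Vec       u, v;
  PetscReal sigma;
  PetscInt  i, nconv;

  PetscFunctionBeginUser;
  PetscCall(MatCreateVecs(A, &v, &u));       /* v of length n, u of length m */
  PetscCall(SVDCreate(PetscObjectComm((PetscObject)A), &svd));
  PetscCall(SVDSetOperators(svd, A, NULL));  /* standard SVD: single operator */
  PetscCall(SVDSetWhichSingularTriplets(svd, SVD_LARGEST));
  PetscCall(SVDSetDimensions(svd, k, PETSC_DEFAULT, PETSC_DEFAULT));
  PetscCall(SVDSetFromOptions(svd));         /* allow run-time customization */
  PetscCall(SVDSolve(svd));
  PetscCall(SVDGetConverged(svd, &nconv));
  for (i = 0; i < nconv; i++) {
    PetscCall(SVDGetSingularTriplet(svd, i, &sigma, u, v));
    /* use the triplet (sigma, u, v) here */
  }
  PetscCall(SVDDestroy(&svd));
  PetscCall(VecDestroy(&u));
  PetscCall(VecDestroy(&v));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

The smallest triplets can be requested instead with `SVD_SMALLEST`, or at run time via the `-svd_smallest` option.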
#### Equivalent Eigenvalue Problems

It is possible to formulate the problem of computing the singular triplets of a matrix $A$ as an eigenvalue problem involving a Hermitian matrix related to $A$. There are two possible ways of achieving this:

1. With the *cross product* matrix, either $A^*A$ or $AA^*$.
2. With the *cyclic* matrix, $H(A)=\left[\begin{smallmatrix}0&A\\A^*&0\end{smallmatrix}\right]$.

In SLEPc, the computation of the SVD is usually based on one of these two alternatives, either by passing one of these matrices to an `EPS` object or by performing the computation implicitly.

By pre-multiplying equation {math:numref}`eq:svdleft` by $A^*$ and then using equation {math:numref}`eq:svdright`, the following relation results

```{math}
:label: eq:eigleft

A^*Av_i=\sigma_i^2v_i,
```

that is, the $v_i$ are the eigenvectors of matrix $A^*A$ with corresponding eigenvalues equal to $\sigma_i^2$. Note that after computing $v_i$ the corresponding left singular vector, $u_i$, is readily available through equation {math:numref}`eq:svdleft` with just a matrix-vector product, $u_i=\frac{1}{\sigma_i}Av_i$.

Alternatively, one could first compute the left vectors and then the right ones. For this, pre-multiply equation {math:numref}`eq:svdright` by $A$ and then use equation {math:numref}`eq:svdleft` to get

```{math}
:label: eq:eigright

AA^*u_i=\sigma_i^2u_i.
```

In this case, the right singular vectors are obtained as $v_i=\frac{1}{\sigma_i}A^*u_i$.

The two approaches represented in equations {math:numref}`eq:eigleft` and {math:numref}`eq:eigright` are very similar. Note however that $A^*A$ is a square matrix of order $n$ whereas $AA^*$ is of order $m$. In cases where $m\gg n$, the computational effort will favor the $A^*A$ approach. On the other hand, the eigenproblem of equation {math:numref}`eq:eigleft` has $n-r$ zero eigenvalues and that of equation {math:numref}`eq:eigright` has $m-r$ zero eigenvalues. Therefore, continuing with the assumption that $m\geq n$, even in the full rank case the $AA^*$ approach may have a large null space, resulting in difficulties if the smallest singular values are sought. In SLEPc, this will be referred to as the *cross product* approach, and whichever matrix is smaller, either $A^*A$ or $AA^*$, will be used.

Computing the SVD via the cross product approach may be adequate for determining the largest singular triplets of $A$, but the loss of accuracy can be severe for the smallest singular triplets. The cyclic matrix approach is an alternative that avoids this problem, but at the expense of significantly increasing the cost of the computation. Consider the eigendecomposition of

```{math}
:label: eq:cyclic

H(A)=\begin{bmatrix}0&A\\A^*&0\end{bmatrix},
```

which is a Hermitian matrix of order $(m+n)$. It can be shown that $\pm\sigma_i$ is a pair of eigenvalues of $H(A)$ for $i=1,\ldots,r$ and the other $m+n-2r$ eigenvalues are zero. The unit eigenvectors associated with $\pm\sigma_i$ are $\frac{1}{\sqrt{2}}\left[\begin{smallmatrix}\pm u_i\\v_i\end{smallmatrix}\right]$. Thus it is possible to extract the singular values and the left and right singular vectors of $A$ directly from the eigenvalues and eigenvectors of $H(A)$. Note that in this case the singular values are not squared, and therefore the computed values will be more accurate (especially the small ones). The drawback in this case is that small eigenvalues are located in the interior of the spectrum.
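To make these alternatives concrete, the sketch below sets up the cross product eigenproblem of equation {math:numref}`eq:eigleft` explicitly with an `EPS` object, using PETSc's implicit normal-matrix representation so that $A^*A$ is never assembled. This is only meant to expose the underlying idea: the helper name `CrossProductSVD` is hypothetical, the computed $\sigma_i$ are assumed nonzero, and the built-in cross product solver described later performs this computation (including the choice between $A^*A$ and $AA^*$) automatically.

```c
#include <slepceps.h>

/* Sketch: largest singular triplets of A via the cross product matrix A^*A.
   The eigenvalues of N = A^*A are sigma_i^2, the eigenvectors are the right
   singular vectors v_i, and u_i is recovered as A v_i / sigma_i. */
PetscErrorCode CrossProductSVD(Mat A, PetscInt k)
{
  EPS         eps;
  Mat         N;
  Vec         v, u;
  PetscScalar lambda;
  PetscReal   sigma;
  PetscInt    i, nconv;

  PetscFunctionBeginUser;
  PetscCall(MatCreateVecs(A, &v, &u));   /* v of length n, u of length m */
  PetscCall(MatCreateNormal(A, &N));     /* implicit N = A^T A; in complex
                                            arithmetic use MatCreateNormalHermitian() */
  PetscCall(EPSCreate(PetscObjectComm((PetscObject)A), &eps));
  PetscCall(EPSSetOperators(eps, N, NULL));
  PetscCall(EPSSetProblemType(eps, EPS_HEP));   /* Hermitian eigenproblem */
  PetscCall(EPSSetWhichEigenpairs(eps, EPS_LARGEST_REAL));
  PetscCall(EPSSetDimensions(eps, k, PETSC_DEFAULT, PETSC_DEFAULT));
  PetscCall(EPSSolve(eps));
  PetscCall(EPSGetConverged(eps, &nconv));
  for (i = 0; i < nconv; i++) {
    PetscCall(EPSGetEigenpair(eps, i, &lambda, NULL, v, NULL));
    sigma = PetscSqrtReal(PetscRealPart(lambda));   /* sigma_i = sqrt(lambda_i) */
    PetscCall(MatMult(A, v, u));                    /* u_i = A v_i / sigma_i */
    PetscCall(VecScale(u, 1.0/sigma));              /* assumes sigma_i > 0 */
    /* use (sigma, u, v) here */
  }
  PetscCall(EPSDestroy(&eps));
  PetscCall(MatDestroy(&N));
  PetscCall(VecDestroy(&u));
  PetscCall(VecDestroy(&v));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

Working with the implicit representation preserves the sparsity of $A$ and avoids the fill-in incurred by assembling $A^*A$ explicitly.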
(sec:gsvd)=
### The Generalized Singular Value Decomposition (GSVD)

An extension of the SVD to the case of two matrices is the generalized singular value decomposition (GSVD), which can be applied in constrained least squares problems, among others. An overview of the problem can be found in {cite:p}`Golub:1996:MC{8.7.3}`.

Consider two matrices, $A\in\mathbb{C}^{m\times n}$ with $m\geq n$ and $B\in\mathbb{C}^{p\times n}$. Note that both matrices must have the same column dimension. Then there exist two unitary matrices $U\in\mathbb{C}^{m\times m}$ and $V\in\mathbb{C}^{p\times p}$ and an invertible matrix $X\in\mathbb{C}^{n\times n}$ such that

```{math}
:label: eq:gsvd

U^*AX=C,\qquad V^*BX=S,
```

where $C=\mathrm{diag}(c_1,\dots,c_n)$ and $S=\mathrm{diag}(s_{n-q+1},\dots,s_n)$ with $q=\min(p,n)$. The values $c_i$ and $s_i$ are real and nonnegative, and the ratios define the generalized singular values,

```{math}
:label: eq:gsvd-values

\sigma(A,B)\equiv\{c_1/s_1,\dots,c_q/s_q\},
```

and if $p<n$ the remaining $n-q$ diagonal entries of $C$ have no counterpart in $S$, corresponding to infinite generalized singular values.
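In terms of the solver object, a GSVD computation follows the same pattern as the standard SVD, except that the two matrices are passed to `SVDSetOperators` and the problem type is set to generalized. The following fragment is a minimal sketch, assuming $A$ and $B$ are already assembled with the same number of columns; `ComputeGSVD` is a hypothetical helper name.

```c
#include <slepcsvd.h>

/* Sketch: compute the k largest generalized singular values of the pair (A,B) */
PetscErrorCode ComputeGSVD(Mat A, Mat B, PetscInt k)
{
  SVD       svd;
  PetscReal sigma;
  PetscInt  i, nconv;

  PetscFunctionBeginUser;
  PetscCall(SVDCreate(PetscObjectComm((PetscObject)A), &svd));
  PetscCall(SVDSetOperators(svd, A, B));               /* two operators: GSVD */
  PetscCall(SVDSetProblemType(svd, SVD_GENERALIZED));
  PetscCall(SVDSetWhichSingularTriplets(svd, SVD_LARGEST));
  PetscCall(SVDSetDimensions(svd, k, PETSC_DEFAULT, PETSC_DEFAULT));
  PetscCall(SVDSetFromOptions(svd));
  PetscCall(SVDSolve(svd));
  PetscCall(SVDGetConverged(svd, &nconv));
  for (i = 0; i < nconv; i++) {
    PetscCall(SVDGetSingularTriplet(svd, i, &sigma, NULL, NULL));  /* sigma = c_i/s_i */
    /* use sigma here */
  }
  PetscCall(SVDDestroy(&svd));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```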