Two-dimensional singular-value decomposition
In linear algebra, two-dimensional singular-value decomposition (2DSVD) computes the low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to SVD (singular-value decomposition), which computes the low-rank approximation of a single matrix (or a set of 1D vectors).
SVD
Let the matrix $X = [\mathbf{x}_1, \ldots, \mathbf{x}_n]$ contain the set of 1D vectors, which have been centered. In PCA/SVD, we construct the covariance matrix $F$ and the Gram matrix $G$
- $F = XX^{\mathsf{T}}, \quad G = X^{\mathsf{T}}X,$
and compute their eigenvectors $U = [\mathbf{u}_1, \ldots, \mathbf{u}_n]$ and $V = [\mathbf{v}_1, \ldots, \mathbf{v}_n]$. Since $VV^{\mathsf{T}} = I$ and $UU^{\mathsf{T}} = I$, we have
- $X = UU^{\mathsf{T}} X VV^{\mathsf{T}} = U\left(U^{\mathsf{T}} X V\right)V^{\mathsf{T}} = U \Sigma V^{\mathsf{T}}.$
If we retain only the $K$ principal eigenvectors in $U$ and $V$, this gives a low-rank approximation of $X$.
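The procedure above can be sketched in NumPy. This is an illustrative example (not from the original article), using random data and an arbitrary choice of $K = 2$; it checks that projecting onto the top eigenvectors of $F$ and $G$ recovers the optimal rank-$K$ truncation given by the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 6))
X -= X.mean(axis=1, keepdims=True)  # center the set of column vectors

# Covariance matrix F and Gram matrix G
F = X @ X.T
G = X.T @ X

# Eigenvectors of F and G, sorted by descending eigenvalue
ef, U = np.linalg.eigh(F)
eg, V = np.linalg.eigh(G)
U = U[:, np.argsort(ef)[::-1]]
V = V[:, np.argsort(eg)[::-1]]

# Keep only the K principal eigenvectors
K = 2
Uk, Vk = U[:, :K], V[:, :K]
Sigma = Uk.T @ X @ Vk               # K x K core (diagonal up to signs)
X_approx = Uk @ Sigma @ Vk.T        # rank-K approximation of X

# The same approximation obtained directly from the SVD
u, s, vt = np.linalg.svd(X)
X_svd = u[:, :K] @ np.diag(s[:K]) @ vt[:K, :]
print(np.allclose(X_approx, X_svd))
```

Because the top-$K$ eigenvectors of $F$ and $G$ span the same subspaces as the top-$K$ left and right singular vectors, both routes produce the same best rank-$K$ approximation (for generic data with distinct eigenvalues).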
2DSVD
Here we deal with a set of 2D matrices $(X_1, \ldots, X_n)$. Suppose they are centered, $\sum_i X_i = 0$. We construct the row–row and column–column covariance matrices
- $F = \sum_i X_i X_i^{\mathsf{T}}$ and $G = \sum_i X_i^{\mathsf{T}} X_i$
in exactly the same manner as in SVD, and compute their eigenvectors $U$ and $V$. We approximate $X_i$ as
- $X_i \approx UU^{\mathsf{T}} X_i VV^{\mathsf{T}} = U\left(U^{\mathsf{T}} X_i V\right)V^{\mathsf{T}} = U M_i V^{\mathsf{T}}$
in identical fashion to SVD. This gives a near-optimal low-rank approximation of $(X_1, \ldots, X_n)$ under the objective function
- $J = \sum_{i=1}^{n} \left\| X_i - U M_i V^{\mathsf{T}} \right\|^2.$
Error bounds similar to those of the Eckart–Young theorem also exist.
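The 2DSVD construction can likewise be sketched in NumPy. This is an illustrative example with random matrices and an arbitrary $K = 3$ (not part of the original article): each $X_i$ is represented by a small $K \times K$ core matrix $M_i$ together with the shared factors $U$ and $V$.

```python
import numpy as np

rng = np.random.default_rng(1)
# A set of n 2D matrices (e.g. images), each r x c, centered so sum_i X_i = 0
n, r, c = 5, 10, 8
Xs = rng.standard_normal((n, r, c))
Xs -= Xs.mean(axis=0)

# Row-row and column-column covariance matrices
F = sum(X @ X.T for X in Xs)
G = sum(X.T @ X for X in Xs)

# Top-K eigenvectors of F and G (eigh returns ascending eigenvalues)
K = 3
ef, U = np.linalg.eigh(F)
eg, V = np.linalg.eigh(G)
Uk = U[:, np.argsort(ef)[::-1][:K]]
Vk = V[:, np.argsort(eg)[::-1][:K]]

# Each X_i is encoded by its K x K core matrix M_i = U^T X_i V
Ms = [Uk.T @ X @ Vk for X in Xs]
approx = [Uk @ M @ Vk.T for M in Ms]

# Objective J = sum_i ||X_i - U M_i V^T||^2 (Frobenius norm)
J = sum(np.linalg.norm(X - A) ** 2 for X, A in zip(Xs, approx))
print(f"reconstruction error J = {J:.3f}")
```

Storage drops from $n \cdot r \cdot c$ entries to $n K^2 + rK + cK$, which is the source of the compression: the factors $U$ and $V$ are shared across the whole set, and only the small $M_i$ vary per matrix.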
2DSVD is mostly used in image compression and representation.
References
- Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005. http://ranger.uta.edu/~chqding/papers/2dsvdSDM05.pdf
- Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning Journal. Vol. 61, pp. 167–191, 2005.