
Two-dimensional singular-value decomposition


Short description: Method of decomposing a set of matrices via low-rank approximation


In linear algebra, two-dimensional singular-value decomposition (2DSVD) computes the low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to the singular-value decomposition (SVD), which computes the low-rank approximation of a single matrix (or a set of 1D vectors).

SVD

Let the matrix [math]\displaystyle{ X = [\mathbf x_1, \ldots, \mathbf x_n] }[/math] contain the set of 1D vectors, which have been centered. In PCA/SVD, we construct the covariance matrix [math]\displaystyle{ F }[/math] and the Gram matrix [math]\displaystyle{ G }[/math]

[math]\displaystyle{ F = X X^\mathsf{T} }[/math] , [math]\displaystyle{ G = X^\mathsf{T} X, }[/math]

and compute their eigenvectors [math]\displaystyle{ U = [\mathbf u_1, \ldots, \mathbf u_m] }[/math] and [math]\displaystyle{ V = [\mathbf v_1, \ldots, \mathbf v_n] }[/math], where [math]\displaystyle{ m }[/math] is the number of rows of [math]\displaystyle{ X }[/math]. Since [math]\displaystyle{ VV^\mathsf{T} = I }[/math] and [math]\displaystyle{ UU^\mathsf{T} = I }[/math], we have

[math]\displaystyle{ X = UU^\mathsf{T} X VV^\mathsf{T} = U \left(U^\mathsf{T} XV\right) V^\mathsf{T} = U \Sigma V^\mathsf{T}. }[/math]

If we retain only the [math]\displaystyle{ K }[/math] principal eigenvectors in [math]\displaystyle{ U, V }[/math], this gives a low-rank approximation of [math]\displaystyle{ X }[/math].
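
Below is a minimal NumPy sketch of this construction; the matrix size, the value of K, and all variable names are illustrative assumptions rather than part of the method. Since numpy.linalg.eigh returns eigenvalues in ascending order, the principal eigenvectors are taken from the last columns.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30))    # columns are the 1D vectors x_i (illustrative size)
X -= X.mean(axis=1, keepdims=True)   # center the vectors

F = X @ X.T                          # covariance matrix F = X X^T
G = X.T @ X                          # Gram matrix G = X^T X
_, U = np.linalg.eigh(F)             # eigenvalues returned in ascending order
_, V = np.linalg.eigh(G)
K = 5
U = U[:, ::-1][:, :K]                # keep the K principal eigenvectors
V = V[:, ::-1][:, :K]

Sigma = U.T @ X @ V                  # K x K core matrix
X_approx = U @ Sigma @ V.T           # rank-K approximation of X
print(np.linalg.norm(X - X_approx))  # reconstruction error

After truncation, U U^T and V V^T are projectors onto the principal subspaces, so X_approx coincides (generically) with the rank-K truncated SVD of X.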

2DSVD

Here we deal with a set of 2D matrices [math]\displaystyle{ (X_1,\ldots,X_n) }[/math]. Suppose they are centered, so that [math]\displaystyle{ \sum_i X_i = 0 }[/math]. We construct the row–row and column–column covariance matrices

[math]\displaystyle{ F = \sum_i X_i X_i^\mathsf{T} }[/math] and [math]\displaystyle{ G = \sum_i X_i^\mathsf{T} X_i }[/math]

in exactly the same manner as in SVD, and compute their eigenvectors [math]\displaystyle{ U }[/math] and [math]\displaystyle{ V }[/math]. We approximate [math]\displaystyle{ X_i }[/math] as

[math]\displaystyle{ X_i \approx U U^\mathsf{T} X_i V V^\mathsf{T} = U \left(U^\mathsf{T} X_i V\right) V^\mathsf{T} = U M_i V^\mathsf{T} }[/math]

exactly as in SVD. This gives a near-optimal low-rank approximation of [math]\displaystyle{ (X_1,\ldots,X_n) }[/math] with respect to the objective function

[math]\displaystyle{ J = \sum_{i=1}^n \left\| X_i - U M_i V^\mathsf{T}\right\|^2 . }[/math]

Error bounds similar to those of the Eckart–Young theorem also exist.
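
The following NumPy sketch mirrors that construction for a set of matrices and evaluates the objective J above; the number of matrices, their size, and the retained ranks k and l are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
Xs = rng.standard_normal((100, 16, 12))  # n = 100 matrices, each 16 x 12 (illustrative)
Xs -= Xs.mean(axis=0)                    # center the set: sum_i X_i = 0

F = sum(Xi @ Xi.T for Xi in Xs)          # row-row covariance, 16 x 16
G = sum(Xi.T @ Xi for Xi in Xs)          # column-column covariance, 12 x 12
_, U = np.linalg.eigh(F)                 # eigenvalues returned in ascending order
_, V = np.linalg.eigh(G)
k, l = 4, 3
U = U[:, ::-1][:, :k]                    # k principal row-space eigenvectors
V = V[:, ::-1][:, :l]                    # l principal column-space eigenvectors

Ms = np.stack([U.T @ Xi @ V for Xi in Xs])         # each M_i is k x l
Xs_approx = np.stack([U @ Mi @ V.T for Mi in Ms])  # X_i ~ U M_i V^T
J = np.sum((Xs - Xs_approx) ** 2)                  # objective function J
print(J)

Each X_i is then represented by the small k × l matrix M_i plus the shared factors U and V, which is the source of the compression.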

2DSVD is mostly used in image compression and representation.

References

  • Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005. http://ranger.uta.edu/~chqding/papers/2dsvdSDM05.pdf
  • Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning, Vol. 61, pp. 167–191, 2005.



