Tensor decomposition
In multilinear algebra, a tensor decomposition is any scheme for expressing a "data tensor" (M-way array) as a sequence of elementary operations acting on other, often simpler tensors.[1] [2] [3] Many tensor decompositions generalize some matrix decompositions.[4]
Tensors are generalizations of matrices to higher dimensions (or rather to higher orders, i.e. to a greater number of dimensions, also called modes) and can consequently be treated as multidimensional fields.[1] [5] The main tensor decompositions are:
- Tensor rank decomposition (see the sketch below);[6]
- Higher-order singular value decomposition;[7]
- Tucker decomposition;
- matrix product states and operators, also known as tensor trains;
- online tensor decomposition;[8] [9] [10]
- hierarchical Tucker decomposition;[11]
- block term decomposition.[12] [13] [11] [14]
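As a concrete illustration of the first item, the tensor rank (CP) decomposition expresses a tensor as a sum of R rank-1 terms, i.e. outer products of columns of per-mode factor matrices. The following is a minimal sketch, assuming NumPy; the factor matrices A, B, C and the sizes are hypothetical placeholders, and in practice they would be fitted to data (for example by alternating least squares) rather than drawn at random.

```python
# Minimal sketch (assumption: NumPy only) of what a rank-R CP (tensor rank)
# decomposition expresses: a 3-mode tensor written as a sum of R outer
# products of columns of factor matrices A, B, C.
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3

# Hypothetical factor matrices; in practice they are fitted to data,
# e.g. by alternating least squares.
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Equivalent formulation as a sum of R rank-1 (outer-product) tensors.
X_check = sum(np.multiply.outer(np.multiply.outer(A[:, r], B[:, r]), C[:, r])
              for r in range(R))
assert np.allclose(X, X_check)
```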
Notation
This section introduces the basic notation and operations that are widely used in the field.
| Symbols | Definition |
|---|---|
| $a, \mathbf{a}, \mathbf{a}^{T}, \mathbf{A}, \mathcal{A}$ | scalar, vector, row vector, matrix, tensor |
| $\mathbf{a} = \operatorname{vec}(\cdot)$ | vectorization of a matrix or a tensor |
| $\mathbf{A}_{[m]}$ | mode-$m$ matricization (unfolding) of the tensor $\mathcal{A}$ |
| $\times_{m}$ | mode-$m$ product |
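The mode-$m$ matricization and the mode-$m$ product in the table above can be made concrete with a short sketch, assuming NumPy; the function names unfold and mode_m_product, as well as the column-ordering convention of the unfolding, are illustrative choices rather than a fixed standard.

```python
# A small sketch (NumPy assumed) of the notation above: mode-m matricization
# (unfolding) of a tensor and the mode-m product with a matrix.
import numpy as np

def unfold(T, m):
    """Mode-m matricization: rows indexed by mode m, columns by the
    remaining modes (one common ordering convention)."""
    return np.moveaxis(T, m, 0).reshape(T.shape[m], -1)

def mode_m_product(T, U, m):
    """Mode-m product T x_m U: contracts mode m of T with the columns of U."""
    return np.moveaxis(np.tensordot(U, T, axes=(1, m)), 0, m)

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4, 5))   # a 3-mode tensor
U = rng.standard_normal((7, 4))      # matrix applied along mode 1

Y = mode_m_product(X, U, 1)
print(Y.shape)                        # (3, 7, 5)

# Defining identity: unfolding the result along mode m equals U times the
# unfolding of the original tensor along mode m.
assert np.allclose(unfold(Y, 1), U @ unfold(X, 1))
```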
Introduction
A multi-way graph with K perspectives is a collection of K matrices $\{X_1, X_2, \ldots, X_K\}$ with dimensions I × J (where I, J are the numbers of nodes). This collection of matrices is naturally represented as a tensor X of size I × J × K. In order to avoid overloading the term "dimension", we call an I × J × K tensor a three-"mode" tensor, where a "mode" is one of the indices used to index the tensor.
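For instance, the following sketch (assuming NumPy; the sizes and random adjacency matrices are hypothetical placeholders) stacks K such matrices into a three-mode tensor:

```python
# Illustrative sketch (NumPy assumed): stacking K adjacency matrices of a
# multi-way graph, each of size I x J, into a single I x J x K tensor.
import numpy as np

rng = np.random.default_rng(0)
I, J, K = 5, 5, 3                       # I, J nodes; K perspectives (views)

# Hypothetical adjacency matrices X_1, ..., X_K, one per perspective.
views = [rng.integers(0, 2, size=(I, J)) for _ in range(K)]

# Mode 1 indexes source nodes, mode 2 target nodes, mode 3 the perspective.
X = np.stack(views, axis=-1)
print(X.shape)                          # (5, 5, 3) -> a three-mode tensor
```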
References
- ^ a b Vasilescu, MAO; Terzopoulos, D (2007). "Multilinear (tensor) image synthesis, analysis, and recognition [exploratory dsp]". IEEE Signal Processing Magazine. 24 (6): 118–123. Bibcode:2007ISPM...24R.118V. doi:10.1109/MSP.2007.906024.
- ^ Kolda, Tamara G.; Bader, Brett W. (2009年08月06日). "Tensor Decompositions and Applications" . SIAM Review. 51 (3): 455–500. Bibcode:2009SIAMR..51..455K. doi:10.1137/07070111X. ISSN 0036-1445. S2CID 16074195.
- ^ Sidiropoulos, Nicholas D.; De Lathauwer, Lieven; Fu, Xiao; Huang, Kejun; Papalexakis, Evangelos E.; Faloutsos, Christos (2017年07月01日). "Tensor Decomposition for Signal Processing and Machine Learning". IEEE Transactions on Signal Processing. 65 (13): 3551–3582. arXiv:1607.01668 . Bibcode:2017ITSP...65.3551S. doi:10.1109/TSP.2017.2690524. ISSN 1053-587X. S2CID 16321768.
- ^ Bernardi, A.; Brachat, J.; Comon, P.; Mourrain, B. (2013年05月01日). "General tensor decomposition, moment matrices and applications". Journal of Symbolic Computation. 52: 51–71. arXiv:1105.1229 . doi:10.1016/j.jsc.2012年05月01日2. hdl:11572/134905. ISSN 0747-7171. S2CID 14181289.
- ^ Rabanser, Stephan; Shchur, Oleksandr; Günnemann, Stephan (2017). "Introduction to Tensor Decompositions and their Applications in Machine Learning". arXiv:1711.10781 [stat.ML].
- ^ Papalexakis, Evangelos E. (2016年06月30日). "Automatic Unsupervised Tensor Mining with Quality Assessment". Proceedings of the 2016 SIAM International Conference on Data Mining. Society for Industrial and Applied Mathematics. pp. 711–719. arXiv:1503.03355 . doi:10.1137/1.9781611974348.80. ISBN 978-1-61197-434-8. S2CID 10147789.
- ^ Vasilescu, M.A.O.; Terzopoulos, D. (2002). Multilinear Analysis of Image Ensembles: TensorFaces (PDF). Lecture Notes in Computer Science; (Presented at Proc. 7th European Conference on Computer Vision (ECCV'02), Copenhagen, Denmark). Vol. 2350. Springer, Berlin, Heidelberg. doi:10.1007/3-540-47969-4_30. ISBN 978-3-540-43745-1. Archived from the original (PDF) on 2022年12月29日. Retrieved 2023年03月19日.
- ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos E. (7 May 2018). Ester, Martin; Pedreschi, Dino (eds.). Proceedings of the 2018 SIAM International Conference on Data Mining. doi:10.1137/1.9781611975321. hdl:10536/DRO/DU:30109588 . ISBN 978-1-61197-532-1. S2CID 21674935.
- ^ Gujral, Ekta; Papalexakis, Evangelos E. (9 October 2020). "OnlineBTD: Streaming Algorithms to Track the Block Term Decomposition of Large Tensors". 2020 IEEE 7th International Conference on Data Science and Advanced Analytics (DSAA). pp. 168–177. doi:10.1109/DSAA49011.2020.00029. ISBN 978-1-7281-8206-3. S2CID 227123356.
- ^ Gujral, Ekta (2022). "Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition". arXiv:2210.04404 [cs.SI].
- ^ a b Vasilescu, M.A.O.; Kim, E. (2019). Compositional Hierarchical Tensor Factorization: Representing Hierarchical Intrinsic and Extrinsic Causal Factors. In The 25th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD’19): Tensor Methods for Emerging Data Science Challenges. arXiv:1911.04180 .
- ^ De Lathauwer, Lieven (2008). "Decompositions of a Higher-Order Tensor in Block Terms—Part II: Definitions and Uniqueness" . SIAM Journal on Matrix Analysis and Applications. 30 (3): 1033–1066. doi:10.1137/070690729.
- ^ Vasilescu, M.A.O.; Kim, E.; Zeng, X.S. (2021), "CausalX: Causal eXplanations and Block Multilinear Factor Analysis", Conference Proc. of the 2020 25th International Conference on Pattern Recognition (ICPR 2020), pp. 10736–10743, arXiv:2102.12853 , doi:10.1109/ICPR48806.2021.9412780, ISBN 978-1-7281-8808-9, S2CID 232046205
- ^ Gujral, Ekta; Pasricha, Ravdeep; Papalexakis, Evangelos (2020年04月20日). "Beyond Rank-1: Discovering Rich Community Structure in Multi-Aspect Graphs". Proceedings of the Web Conference 2020. Taipei Taiwan: ACM. pp. 452–462. doi:10.1145/3366423.3380129. ISBN 978-1-4503-7023-3. S2CID 212745714.