Parallel Algorithms for Low-Rank Approximations of Matrices and Tensors
- abstract
- Low-rank approximations are useful for compressing and interpreting large datasets. Distributed parallel algorithms for computing such approximations of matrices and tensors make it possible to handle even larger datasets that cannot fit on a single computer. In this thesis I present the parallelization of two such approximation algorithms: Hierarchical Nonnegative Matrix Factorization and Tensor Train rounding. In both cases, the distributed parallel algorithms outperform the state of the art. (An illustrative sketch of low-rank approximation follows this record.)
- subject
- linear algebra
- low-rank approximation
- nonnegative matrix factorization
- parallel algorithms
- tensor decompositions
- tensor train
- contributor
- Ballard, Grey (committee chair)
- Erway, Jennifer (committee member)
- Cho, Samuel (committee member)
- date
- 2021-06-03T08:36:13Z (accessioned)
- 2021-06-03T08:36:13Z (available)
- 2021 (issued)
- degree
- Computer Science (discipline)
- identifier
- http://hdl.handle.net/10339/98822 (uri)
- language
- en (iso)
- publisher
- Wake Forest University
- title
- Parallel Algorithms for Low-Rank Approximations of Matrices and Tensors
- type
- Thesis
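
The abstract describes low-rank approximation in general terms; the thesis itself concerns distributed parallel algorithms, which are not shown here. As a minimal, sequential sketch of the underlying idea only, the snippet below (assuming NumPy; the matrix `A` and rank `k` are hypothetical placeholders, not data or methods from the thesis) forms a rank-k approximation via the truncated SVD and reports the compression the abstract alludes to.

```python
import numpy as np

# Illustrative only: a small random matrix stands in for a real dataset.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 300))
k = 20  # target rank (hypothetical choice)

# Truncated SVD: keep only the k largest singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storing the factors instead of A is the "compression" the abstract refers to.
original_entries = A.size
compressed_entries = U[:, :k].size + k + Vt[:k, :].size
print(f"relative error:    {np.linalg.norm(A - A_k) / np.linalg.norm(A):.3f}")
print(f"compression ratio: {original_entries / compressed_entries:.1f}x")
```

This sequential example is only meant to make the terms in the abstract concrete; the thesis's contribution is distributing such computations (Hierarchical NMF and TT rounding) across many processors.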