Abstract
Matrix completion is the problem of recovering the missing entries of a partially known matrix as well as possible. In general, further assumptions about the partially known matrix are needed to make "as well as possible" precise. Many matrices arising in computational mathematics are approximately low-rank, so we assume that the partially known matrix agrees with some low-rank matrix on its known entries.
We show that the low-rank completions of a partially known matrix may be recovered by only considering functions of specific known entries.
We also introduce a new gradient-descent-based approach to solving the low-rank matrix completion problem. This method relies on finding a submatrix whose determinant is large in modulus, called a dominant submatrix, which is done with the maximum volume algorithm. A greedy version of this algorithm is presented which improves on the original in processing time.
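The maximum volume step referred to above can be sketched as follows. This is a minimal illustration of the classical maxvol iteration (not the thesis's greedy variant): given a tall n-by-r matrix, it repeatedly swaps a row of the current r-by-r submatrix for an outside row whenever the swap grows the determinant's modulus by more than a tolerance factor. The function name, the tolerance value, and the choice of the first r rows as a starting submatrix are illustrative assumptions.

```python
import numpy as np

def maxvol(A, tol=1.05, max_iters=100):
    """Find r rows of the tall n x r matrix A whose r x r submatrix
    has locally maximal |det| (volume). Returns the row indices."""
    n, r = A.shape
    rows = list(range(r))  # assumption: the first r rows are nonsingular
    for _ in range(max_iters):
        # B = A * inv(A[rows]); B[rows] is the identity, and swapping
        # rows[j] for row i multiplies |det| of the submatrix by |B[i, j]|.
        B = A @ np.linalg.inv(A[rows])
        i, j = np.unravel_index(np.argmax(np.abs(B)), B.shape)
        if abs(B[i, j]) <= tol:  # no swap grows the volume enough; stop
            break
        rows[j] = i
    return rows
```

On exit, every entry of B is at most tol in modulus, which certifies that the selected submatrix is dominant up to the factor tol.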
Moreover, we present a new upper bound on the number of dominant submatrices in terms of the independence number of Johnson graphs.
Like matrix completion, the problem of tensor completion is to complete a partially known tensor subject to the constraint that the completion is low-rank. There are multiple notions of the rank of a tensor. We give sufficient conditions for a partially known tensor to have a unique low-multilinear-rank completion. We also generalize several of our results for low-rank matrix completion to low-rank tensor completion. In particular, we introduce a Schur complement maximum volume based gradient descent method for recovering a low-rank completion of a partially known tensor.
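The multilinear rank mentioned above is the tuple of ranks of the tensor's mode-k unfoldings. A minimal sketch of computing it with numpy (the function name and tolerance are illustrative assumptions):

```python
import numpy as np

def multilinear_rank(T, tol=1e-10):
    """Multilinear rank of T: the tuple of matrix ranks of the
    mode-k unfoldings, one per mode of the tensor."""
    return tuple(
        # mode-k unfolding: bring axis k to the front, flatten the rest
        np.linalg.matrix_rank(
            np.moveaxis(T, k, 0).reshape(T.shape[k], -1), tol=tol
        )
        for k in range(T.ndim)
    )
```

For example, an outer product of three vectors has multilinear rank (1, 1, 1), while the matrix case reduces to the ordinary rank appearing twice.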