
### Computational Complexity of Tensor Decomposition

**Special Events**

Speaker: Alex Wein, UC Davis (Math)

Related Webpage: https://www.alex-wein.com/

Location: 1147 MSB

Start time: Fri, Dec 2, 2022, 11:00 AM

Tensor decomposition is an important subroutine in numerous algorithms for data analysis, including topic modeling, community detection, clustering, and dictionary learning. We consider a simple model for tensor decomposition: suppose we are given a random rank-r order-3 tensor (that is, an n-by-n-by-n array of numbers that is the sum of r random rank-1 terms), and our goal is to recover the individual rank-1 terms. In principle this decomposition task is possible when r < cn^2 for a constant c, but all known polynomial-time algorithms require r << n^{3/2}. Is this a fundamental barrier for efficient algorithms? In recent years, the computational complexity of various high-dimensional statistical tasks has been resolved in restricted-but-powerful models of computation such as statistical queries, sum-of-squares, or low-degree polynomials. However, tensor decomposition has remained elusive, largely because its hardness is not explained by a "planted versus null" testing problem. We show the first formal hardness result for average-case tensor decomposition: when r >> n^{3/2}, the decomposition task is hard for algorithms that can be expressed as low-degree polynomials in the tensor entries.
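To make the model concrete, here is a minimal sketch (not from the talk; names `n`, `r`, and the Gaussian choice of factors are illustrative assumptions) of generating a random rank-r order-3 tensor as a sum of r random rank-1 terms:

```python
# Illustrative sketch of the random tensor model in the abstract:
# T = sum of r rank-1 terms, each an outer product a ⊗ b ⊗ c of
# random vectors. The parameters n, r and the Gaussian distribution
# of the factors are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, r = 10, 4  # small sizes for illustration; the talk concerns r up to ~ c*n^2

# Draw r triples of random factor vectors.
factors = [(rng.standard_normal(n),
            rng.standard_normal(n),
            rng.standard_normal(n)) for _ in range(r)]

# Sum the r rank-1 terms into an n-by-n-by-n array.
T = sum(np.einsum('i,j,k->ijk', a, b, c) for a, b, c in factors)

print(T.shape)  # (10, 10, 10)
```

The decomposition task discussed in the talk is the inverse problem: given only `T`, recover the individual triples `(a, b, c)` (up to the usual scaling and permutation ambiguities).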

This talk is part of the first Joint CeDAR/UCD4IDS Conference.