
Tensor decompositions and algorithms, with applications to tensor learning

Published 12 Oct 2021 in math.NA and cs.NA | (2110.05997v1)

Abstract: A new algorithm for computing the canonical polyadic decomposition (CPD) is presented here. It features lower computational complexity and memory usage than the available state-of-the-art implementations. We begin with some examples of CPD applications to real-world problems, followed by a short summary of the main contributions of this work. In chapter 1 we review classical tensor algebra and geometry, with a focus on the CPD. Chapter 2 focuses on tensor compression, which is considered (in this work) to be one of the most important parts of a CPD algorithm. In chapter 3 we discuss the Gauss-Newton method, a classical method for nonlinear least squares problems. Chapter 4 is the longest of this thesis. In it we introduce its centerpiece: Tensor Fox, a tensor package that includes a CPD solver. After introducing Tensor Fox, we conduct extensive computational experiments comparing this solver with several others. At the end of this chapter we introduce the Tensor Train decomposition and show how to use it to compute higher-order CPDs. We also discuss some important details such as regularization, preconditioning, conditioning, and parallelism. In chapter 5 we consider the intersection between tensor decompositions and machine learning, introducing a novel model that works as a tensor analogue of neural networks. Finally, in chapter 6 we present our conclusions and expectations for future developments.
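To make the central object concrete: a rank-R CPD approximates a tensor as a sum of R rank-one terms. The sketch below computes a third-order CPD with plain alternating least squares (ALS) in NumPy. This is only an illustrative baseline of the decomposition itself, not the thesis's algorithm or the Tensor Fox implementation; all function names here are our own.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest
    # (remaining axes kept in order, last index varying fastest).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product of two factor matrices with equal rank.
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cpd_als(T, rank, n_iter=200, seed=0):
    # Rank-`rank` CPD of a 3-way tensor T via alternating least squares:
    # each factor is updated by solving a linear least squares problem
    # while the other two are held fixed.
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

def cpd_reconstruct(A, B, C):
    # Rebuild the full tensor from its three factor matrices.
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

For an exactly rank-2 tensor, a few hundred ALS sweeps typically recover the factors to near machine precision; real CPD solvers (including, per the abstract, Tensor Fox) add compression, regularization, and second-order methods such as Gauss-Newton on top of this basic scheme.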
