
Extension of PCA to Higher Order Data Structures: An Introduction to Tensors, Tensor Decompositions, and Tensor PCA

Published 2 Mar 2018 in eess.SP (arXiv:1803.00704v2)

Abstract: The widespread use of multisensor technology and the emergence of big data sets have created the need for more versatile tools to represent higher-order data with multiple aspects and high dimensionality. Data in the form of multidimensional arrays, also referred to as tensors, arise in a variety of applications including chemometrics, hyperspectral imaging, high-resolution video, neuroimaging, biometrics, and social network analysis. Early multiway data analysis approaches reformatted such tensor data as large vectors or matrices and then resorted to dimensionality reduction methods developed for classical two-way analysis, such as PCA. However, conventional PCA cannot discover the hidden components within multiway data. To this end, tensor decomposition methods have been proposed that are flexible in the choice of constraints and that extract more general latent components. In this paper, we review the major tensor decomposition methods with a focus on problems targeted by classical PCA. In particular, we present tensor methods that aim to solve three important challenges typically addressed by PCA: dimensionality reduction, i.e., low-rank tensor approximation; supervised learning, i.e., learning linear subspaces for feature extraction; and robust low-rank tensor recovery. We also provide experimental results comparing different tensor models for both dimensionality reduction and supervised learning applications.
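To make the contrast with vectorize-then-PCA concrete, the sketch below implements one representative tensor decomposition of the kind the abstract describes: a truncated higher-order SVD (HOSVD), which applies a PCA-like truncated SVD to each mode unfolding of a tensor instead of flattening it into a single matrix. This is a minimal NumPy illustration, not the paper's specific algorithm; the function names and the chosen ranks are illustrative.

```python
import numpy as np

def unfold(tensor, mode):
    """Matricize: bring `mode` to the front and flatten the remaining modes."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_product(tensor, matrix, mode):
    """Mode-n product: multiply `matrix` into `tensor` along axis `mode`."""
    moved = np.moveaxis(tensor, mode, 0)
    return np.moveaxis(np.tensordot(matrix, moved, axes=1), 0, mode)

def hosvd(tensor, ranks):
    """Truncated HOSVD: per-mode PCA giving a Tucker-style decomposition.

    Returns (core, factors) with tensor ≈ core x_1 U_1 x_2 U_2 x_3 U_3 ...
    """
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of the mode-n unfolding play the
        # role of principal components along that mode.
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :r])
    core = tensor
    for mode, U in enumerate(factors):
        core = mode_product(core, U.T, mode)  # project onto each subspace
    return core, factors

def reconstruct(core, factors):
    out = core
    for mode, U in enumerate(factors):
        out = mode_product(out, U, mode)
    return out

# Build a tensor with exact low multilinear rank (2, 3, 2); truncated HOSVD
# at the true ranks then recovers it exactly.
rng = np.random.default_rng(0)
G = rng.standard_normal((2, 3, 2))
Us = [rng.standard_normal((d, r)) for d, r in [(5, 2), (6, 3), (4, 2)]]
X = reconstruct(G, Us)

core, factors = hosvd(X, ranks=(2, 3, 2))
Xhat = reconstruct(core, factors)
print(np.allclose(X, Xhat))
```

Unlike flattening the whole tensor into one long vector for PCA, the per-mode factors here preserve which axis each latent component acts on, which is the structural advantage the abstract attributes to tensor decompositions.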
