
These concepts are supported by illustrative real-world case studies that highlight the benefits of the tensor framework as an efficient and promising tool, inter alia, for modern signal processing, data analysis, and machine-learning applications; moreover, these benefits also extend to vector/matrix data through tensorization.
HISTORICAL NOTES
The roots of multiway analysis can be traced back to studies of homogeneous polynomials in the 19th century, with contributors including Gauss, Kronecker, Cayley, Weyl, and Hilbert. In the modern-day interpretation, these are fully symmetric tensors. Decompositions of nonsymmetric tensors have been studied since the early 20th century [1], whereas the benefits of using more than two matrices in factor analysis (FA) [2] have been apparent in several communities since the 1960s. The Tucker decomposition (TKD) for tensors was introduced in psychometrics [3], [4], while the canonical polyadic decomposition (CPD) was independently rediscovered and put into an application context under the names of canonical decomposition (CANDECOMP) in psychometrics [5] and parallel factor model (PARAFAC) in linguistics [6]. Tensors were subsequently adopted in diverse branches of data analysis such as chemometrics, the food industry, and social sciences [7], [8]. When it comes to signal processing, the early 1990s saw a considerable interest in higher-order statistics (HOS) [9], and it was soon realized that, for multivariate cases, HOS are effectively higher-order tensors; indeed, algebraic approaches to independent component analysis (ICA) using HOS [10]–[12] were inherently tensor based. Around 2000, it was realized that the TKD represents a multilinear singular value decomposition (MLSVD) [15]. Generalizing the matrix singular value decomposition (SVD), the workhorse of numerical linear algebra, the MLSVD spurred the interest in tensors in applied mathematics and scientific computing in very high dimensions [16]–[18]. In parallel, CPD was successfully adopted as a tool for sensor array processing and deterministic signal separation in wireless communication [19], [20]. Subsequently, tensors have been used in audio, image and video processing, machine learning, and biomedical applications, to name but a few areas. The significant interest in tensors and their quickly emerging applications is reflected in books [7], [8], [12], [21]–[23] and tutorial papers [24]–[31] covering various aspects of multiway analysis.
FROM A MATRIX TO A TENSOR
Approaches to two-way (matrix) component analysis are well established and include principal component analysis (PCA), ICA, nonnegative matrix factorization (NMF), and sparse component analysis (SCA) [12], [21], [32]. These techniques have become standard tools for, e.g., blind source separation (BSS), feature extraction, or classification. On the other hand, large classes of data arising from modern heterogeneous sensor modalities have a multiway character and are, therefore, naturally represented by multiway arrays or tensors (see the section "Tensorization—Blessing of Dimensionality").
Early multiway data analysis approaches reformatted the data tensor as a matrix and resorted to methods developed for classical two-way analysis. However, such a flattened view of the world and the rigid assumptions inherent in two-way analysis are not always a good match for multiway data. It is only through higher-order tensor decomposition that we have the opportunity to develop sophisticated models capturing multiple interactions and couplings instead of standard pairwise interactions. In other words, we can only discover hidden components within multiway data if the analysis tools account for the intrinsic multidimensional patterns present, motivating the development of multilinear techniques.
In this article, we emphasize that tensor decompositions are not just matrix factorizations with additional subscripts; multilinear algebra is much more structurally rich than linear algebra. For example, even basic notions such as rank have a more subtle meaning, the uniqueness conditions of higher-order tensor decompositions are more relaxed and accommodating than those for matrices [33], [34], while matrices and tensors also have completely different geometric properties [22]. This boils down to matrices representing linear transformations and quadratic forms, while tensors are connected with multilinear mappings and multivariate polynomials [31].
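To make this last point concrete, the following worked equations (standard definitions, not specific to any cited reference) contrast the forms induced by a matrix and by a third-order tensor. A matrix $\mathbf{A} \in \mathbb{R}^{I \times J}$ acts through the bilinear form
$$f(\mathbf{x}, \mathbf{y}) = \mathbf{x}^{T}\mathbf{A}\,\mathbf{y} = \sum_{i=1}^{I}\sum_{j=1}^{J} a_{ij}\, x_i\, y_j,$$
whereas a third-order tensor $\mathcal{A} \in \mathbb{R}^{I \times J \times K}$ acts through the trilinear form
$$f(\mathbf{x}, \mathbf{y}, \mathbf{z}) = \sum_{i=1}^{I}\sum_{j=1}^{J}\sum_{k=1}^{K} a_{ijk}\, x_i\, y_j\, z_k.$$
Setting $\mathbf{x} = \mathbf{y}$ in the former recovers a quadratic form, while setting $\mathbf{x} = \mathbf{y} = \mathbf{z}$ for a symmetric $\mathcal{A}$ yields a homogeneous polynomial of degree three, precisely the objects studied in the 19th-century work recalled in the section "Historical Notes."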
NOTATIONS AND CONVENTIONS
A tensor can be thought of as a multi-index numerical array, whereby the order of a tensor is the number of its modes or dimensions.
[TABLE 1] BASIC NOTATION.

$\mathcal{A}$, $\mathbf{A}$, $\mathbf{a}$, $a$ : tensor, matrix, vector, scalar

$\mathbf{A} = [\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_R]$ : matrix $\mathbf{A}$ with column vectors $\mathbf{a}_r$

$\mathbf{a}(:, i_2, i_3, \ldots, i_N)$ : fiber of tensor $\mathcal{A}$ obtained by fixing all but one index

$\mathbf{A}(:, :, i_3, \ldots, i_N)$ : matrix slice of tensor $\mathcal{A}$ obtained by fixing all but two indices

$\mathcal{A}(:, :, :, i_4, \ldots, i_N)$ : tensor slice of $\mathcal{A}$ obtained by fixing some indices

$\mathcal{A}(\mathcal{I}_1, \mathcal{I}_2, \ldots, \mathcal{I}_N)$ : subtensor of $\mathcal{A}$ obtained by restricting indices to belong to subsets $\mathcal{I}_n \subseteq \{1, 2, \ldots, I_n\}$

$\mathbf{A}_{(n)} \in \mathbb{R}^{I_n \times I_1 I_2 \cdots I_{n-1} I_{n+1} \cdots I_N}$ : mode-$n$ matricization of tensor $\mathcal{A} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$ whose entry at row $i_n$ and column $(i_1 - 1) I_2 \cdots I_{n-1} I_{n+1} \cdots I_N + \cdots + (i_{N-1} - 1) I_N + i_N$ is equal to $a_{i_1 i_2 \ldots i_N}$

$\operatorname{vec}(\mathcal{A}) \in \mathbb{R}^{I_N I_{N-1} \cdots I_1}$ : vectorization of tensor $\mathcal{A} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$ with the entry at position $i_1 + \sum_{k=2}^{N} \left[ (i_k - 1) I_1 I_2 \cdots I_{k-1} \right]$ equal to $a_{i_1 i_2 \ldots i_N}$

$\mathbf{D} = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_R)$ : diagonal matrix with $d_{rr} = \lambda_r$

$\mathcal{D} = \operatorname{diag}_N(\lambda_1, \lambda_2, \ldots, \lambda_R)$ : diagonal tensor of order $N$ with $d_{rr \ldots r} = \lambda_r$

$\mathbf{A}^{T}$, $\mathbf{A}^{-1}$, $\mathbf{A}^{\dagger}$ : transpose, inverse, and Moore–Penrose pseudoinverse
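As a sanity check on these conventions, here is a minimal NumPy sketch (the helper names unfold and vectorize are ours, not from any particular toolbox) of the mode-$n$ matricization and vectorization defined in Table 1, with Python's 0-based indices standing in for the 1-based indices above:

import numpy as np

def unfold(tensor, mode):
    # Mode-n matricization per Table 1: row index is i_n; the remaining
    # indices keep their original order, with the last one varying
    # fastest (C order), matching the column formula above.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def vectorize(tensor):
    # Vectorization per Table 1: the first index i_1 varies fastest,
    # i.e., column-major (Fortran) ordering.
    return tensor.reshape(-1, order="F")

# A small third-order example, A in R^{2 x 3 x 4}.
A = np.arange(24).reshape(2, 3, 4)

A2 = unfold(A, 1)   # mode-2 matricization, shape (3, 8)
v = vectorize(A)    # vector of length 24

# Entry a_{1,2,3} (1-based) must appear at row i_2 = 2 of the mode-2
# matricization and at position i_1 + (i_2 - 1) I_1 + (i_3 - 1) I_1 I_2
# of vec(A); in 0-based Python indices:
assert A2[1, 0 * 4 + 2] == A[0, 1, 2]
assert v[0 + 1 * 2 + 2 * 2 * 3] == A[0, 1, 2]

The C-order reshape after moving mode $n$ to the front reproduces the column ordering in the matricization formula, while the Fortran-order reshape reproduces the vectorization formula, in which the first index varies fastest.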