[FIG15 graphic: coupled TKDs of K data tensors of size I1 × I2 × I3, each approximated by a core tensor of size R1 × R2 × R3 multiplied in each mode by factor matrices B(1,k) (I1 × R1), B(2,k) (I2 × R2), and B(3,k) (I3 × R3); every factor matrix is partitioned into components B_C(n) shared by all K tensors and components B_I(n,k) individual to each tensor, k = 1, ..., K.]
[FIG16 graphic: (a) classification based on LMWCA: the training data for each category (e.g., Apple, Cow) are reduced to their common features, and a test sample is assigned to the class whose common features best match it; (b) performance comparison: accuracy (%) versus proportion of training data used (%) for LMWCA, KNN-PCA, and LDA-PCA.]
[FIG16] The classification of color objects belonging to different
categories. By using only common features, LMWCA achieves a
high classification rate, even when the training set is small. (a)
Classification based on LMWCA. (b) Performance comparison.
[FIG15] Coupled TKD for LMWCA. The data tensors have
both shared and individual components. Constraints such
as orthogonality, statistical independence, sparsity, and
nonnegativity may be imposed where appropriate.
The N-way toolbox, which includes (constrained) CPD,
TKD, and PLS in the context of chemometrics applications
[114]; many of these methods can handle constraints (e.g.,
nonnegativity and orthogonality) and missing elements.
The TT toolbox, the Hierarchical Tucker toolbox, and the
Tensor Calculus library provide tensor tools for scientific
computing [115]–[117].
Code developed for multiway analysis is also available from
the Three-Mode Company [118].
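In Python, a minimal sketch of the same kind of functionality, assuming the open-source TensorLy package (used here purely as an illustration; function names may differ slightly between versions), could look as follows:

# Illustrative sketch using the TensorLy package (assumed installed via
# `pip install tensorly`); it mirrors the kind of functionality offered by
# the toolboxes above: plain CPD, nonnegativity-constrained CPD, and TKD.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, non_negative_parafac, tucker

rng = np.random.default_rng(0)
X = tl.tensor(rng.random((10, 12, 14)))    # synthetic nonnegative third-order tensor

cp = parafac(X, rank=3)                    # CPD with R = 3 components
cp_nn = non_negative_parafac(X, rank=3)    # CPD with nonnegativity constraints
core, factors = tucker(X, rank=[3, 4, 5])  # TKD with multilinear rank (3, 4, 5)

err_cp = tl.norm(X - tl.cp_to_tensor(cp)) / tl.norm(X)
err_tk = tl.norm(X - tl.tucker_to_tensor((core, factors))) / tl.norm(X)
print(f"relative fit error, CPD: {err_cp:.3f}, TKD: {err_tk:.3f}")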
CONCLUSIONS AND FUTURE DIRECTIONS
We live in a world overwhelmed by data, from the many pictures
of Big Ben shared across social media to terabytes of data in
multiview medical imaging, while scientific experiments may have
to be repeated many times to obtain reliable ground truth. Each
snapshot gives us a somewhat incomplete view of the same object,
differing in viewing angle, illumination and lighting conditions,
facial expression, and noise.
We have shown that tensor decompositions are a perfect
match for exploratory analysis of such multifaceted data sets
and have illustrated their applications in multisensor and
multimodal signal processing. Our emphasis has been to show that
tensor decompositions and multilinear algebra open up
completely new possibilities for component analysis, as compared
with the flat view of standard two-way methods.
Unlike matrices, tensors are multiway arrays of data samples
whose representations are typically overdetermined (fewer
parameters in the decomposition than the number of data
entries). This gives us enormous flexibility in finding hidden
components in data and the ability to enhance both robustness
to noise and tolerance to missing data samples and faulty
sensors. We have also discussed multilinear variants of several
standard signal processing tools such as multilinear SVD, ICA,
NMF, and PLS and have shown that tensor methods can operate
in a deterministic way on signals of very short duration.
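As a concrete example, the multilinear SVD mentioned above can be sketched in a few lines of NumPy; this is a bare-bones illustration of the idea (truncated factor matrices from the mode-n unfoldings, followed by projection to obtain the core), not an optimized implementation:

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move the chosen mode to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_dot(T, M, mode):
    # Mode-n product of tensor T with a matrix M of shape (J, I_mode).
    out = M @ unfold(T, mode)
    new_shape = [M.shape[0]] + [T.shape[m] for m in range(T.ndim) if m != mode]
    return np.moveaxis(out.reshape(new_shape), 0, mode)

def hosvd(T, ranks):
    # Truncated multilinear SVD: leading left singular vectors of each
    # mode-n unfolding give the factor matrices; the core is the projection.
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    core = T
    for n, U in enumerate(factors):
        core = mode_dot(core, U.T, n)
    return core, factors

# Example: compress a random 10 x 12 x 14 tensor to multilinear rank (3, 4, 5).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 12, 14))
G, Us = hosvd(X, (3, 4, 5))
Xhat = G
for n, U in enumerate(Us):
    Xhat = mode_dot(Xhat, U, n)
print("relative approximation error:", np.linalg.norm(X - Xhat) / np.linalg.norm(X))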
At present, the uniqueness conditions of standard tensor
models are relatively well understood, and efficient algorithms
for their computation exist. However, for future applications, several
challenging problems remain to be addressed in more depth.
A whole new area emerges when several decompositions
that operate on different data sets are coupled, as in multiview
data where some details of interest are visible in, e.g.,
only one mode. Such techniques need theoretical support in
terms of existence, uniqueness, and numerical properties.
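To give a flavor of the coupling idea, the toy NumPy sketch below assumes that three third-order data tensors share their mode-1 subspace and estimates a common mode-1 factor from the SVD of their concatenated mode-1 unfoldings; it is only an illustration of the shared-subspace idea behind models such as LMWCA ([FIG15]), not one of the coupled algorithms themselves:

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding of a tensor.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def shared_mode1_factor(tensors, rank):
    # Leading left singular vectors of the concatenated mode-1 unfoldings
    # give a basis for the subspace shared by all data tensors in mode 1.
    stacked = np.concatenate([unfold(T, 0) for T in tensors], axis=1)
    return np.linalg.svd(stacked, full_matrices=False)[0][:, :rank]

# Toy multiview data: three tensors built from the same mode-1 components
# (B_common) but with individual cores, plus a little noise.
rng = np.random.default_rng(1)
I1, R = 20, 3
B_common = rng.standard_normal((I1, R))
views = []
for k in range(3):
    core = rng.standard_normal((R, 8, 9))
    view = np.einsum('ir,rjk->ijk', B_common, core)   # mode-1 product
    views.append(view + 0.01 * rng.standard_normal(view.shape))

B_hat = shared_mode1_factor(views, R)
# B_hat should span (almost) the same subspace as B_common, up to rotation.
proj = B_hat @ B_hat.T
err = np.linalg.norm(B_common - proj @ B_common) / np.linalg.norm(B_common)
print("shared-subspace estimation error:", err)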
As the complexity of advanced models increases, their
computation requires efficient iterative algorithms, extending
beyond the ALS class.
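For reference, a baseline member of the ALS class, plain ALS for the CPD, can be sketched in NumPy as follows; this minimal version omits the acceleration, regularization, and scalability refinements that advanced models call for:

import numpy as np

def unfold(T, mode):
    # Mode-n unfolding (the last remaining index varies fastest).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R).
    I, R = A.shape
    J = B.shape[0]
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iter=200, seed=0):
    # Baseline ALS for the CPD: cyclically solve a linear least-squares
    # problem for each factor matrix while keeping the others fixed.
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in X.shape]
    for _ in range(n_iter):
        for n in range(X.ndim):
            others = [factors[m] for m in range(X.ndim) if m != n]
            kr = others[0]
            for M in others[1:]:
                kr = khatri_rao(kr, M)          # chained Khatri-Rao product
            gram = np.ones((rank, rank))
            for M in others:
                gram *= M.T @ M                 # Hadamard product of Gram matrices
            factors[n] = unfold(X, n) @ kr @ np.linalg.pinv(gram)
    return factors

# Example: recover an exact rank-3 structure of a synthetic 15 x 16 x 17 tensor.
rng = np.random.default_rng(2)
true = [rng.standard_normal((d, 3)) for d in (15, 16, 17)]
X = np.einsum('ir,jr,kr->ijk', *true)
est = cp_als(X, rank=3)
Xhat = np.einsum('ir,jr,kr->ijk', *est)
print("relative fit error:", np.linalg.norm(X - Xhat) / np.linalg.norm(X))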