
IEEE SIGNAL PROCESSING MAGAZINE [158] MARCH 2015
$$\underline{\mathbf{X}} \cong \sum_{r=1}^{R} \underline{\mathbf{G}}_X^{(r)} \times_1 \mathbf{t}_r \times_2 \mathbf{P}_r^{(1)} \times_3 \cdots \times_N \mathbf{P}_r^{(N-1)}, \qquad (17)$$
$$\underline{\mathbf{Y}} \cong \sum_{r=1}^{R} \underline{\mathbf{G}}_Y^{(r)} \times_1 \mathbf{u}_r \times_2 \mathbf{Q}_r^{(1)} \times_3 \cdots \times_M \mathbf{Q}_r^{(M-1)}. \qquad (18)$$
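The models in (17) and (18) express the data as a sum of R Tucker block terms, each built from one score vector and a set of loading matrices. As a rough illustration, the following numpy sketch reconstructs a third-order predictor tensor (so N = 3, with loadings P_r^(1) and P_r^(2) per term); all sizes, variable names, and random values are hypothetical.

```python
import numpy as np

# Hypothetical sizes: a third-order X (N = 3), R block terms,
# each with mode ranks L2 and L3.
I1, I2, I3 = 20, 5, 6
R, L2, L3 = 3, 2, 2

rng = np.random.default_rng(0)

# Per block term r: a score vector t_r, loadings P_r^(1), P_r^(2),
# and a core tensor G_r (random values, for illustration only).
terms = [
    (rng.standard_normal(I1),               # t_r      (I1,)
     rng.standard_normal((I2, L2)),         # P_r^(1)  (I2 x L2)
     rng.standard_normal((I3, L3)),         # P_r^(2)  (I3 x L3)
     rng.standard_normal((1, L2, L3)))      # G_r      (1 x L2 x L3)
    for _ in range(R)
]

def hopls_reconstruct(terms):
    """X ~= sum_r  G_r  x1 t_r  x2 P_r^(1)  x3 P_r^(2)."""
    X = 0.0
    for t, P1, P2, G in terms:
        T = np.einsum('alm,i->ilm', G, t)    # mode-1 product with t_r
        T = np.einsum('ilm,jl->ijm', T, P1)  # mode-2 product with P_r^(1)
        T = np.einsum('ijm,km->ijk', T, P2)  # mode-3 product with P_r^(2)
        X = X + T
    return X

X_hat = hopls_reconstruct(terms)
print(X_hat.shape)  # (20, 5, 6)
```

Each term contributes a rank-(1, L2, L3) block, so the approximation quality is controlled jointly by R and the mode ranks.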
A number of data-analytic problems can be reformulated as either regression or similarity analysis [analysis of variance (ANOVA), autoregressive moving average (ARMA) modeling, linear discriminant analysis (LDA), and canonical correlation analysis (CCA)], so that both the matrix and tensor PLS solutions can be generalized across exploratory data analysis.
EXAMPLE 4
The predictive power of tensor-based PLS is illustrated on a real-
world example of the prediction of arm movement trajectory from
the electrocorticogram (ECoG). Figure 14(a) illustrates the experi-
mental setup, whereby the 3-D arm movement of a monkey was
captured by an optical motion capture system with reflective
markers affixed to the left shoulder, elbow, wrist, and hand; for full
details, see http://neurotycho.org. The predictors (32 ECoG channels) naturally build a fourth-order tensor $\underline{\mathbf{X}}$ (time × channel_no × epoch_length × frequency), while the movement trajectories for the four markers (response) can be represented as a third-order tensor $\underline{\mathbf{Y}}$ (time × 3D_marker_position × marker_no). The goal of
the training stage is to identify the HOPLS parameters $\{\underline{\mathbf{G}}_X^{(r)}, \underline{\mathbf{G}}_Y^{(r)}, \mathbf{P}_r^{(n)}, \mathbf{Q}_r^{(n)}\}$ (see Figure 13). In the test stage, the movement trajectories, $\underline{\mathbf{Y}}^*$, for the new ECoG data, $\underline{\mathbf{X}}^*$, are predicted through multilinear projections: 1) the new scores, $\mathbf{t}_r^*$, are found from the new data, $\underline{\mathbf{X}}^*$, and the existing model parameters $\{\underline{\mathbf{G}}_X^{(r)}, \mathbf{P}_r^{(1)}, \mathbf{P}_r^{(2)}, \mathbf{P}_r^{(3)}\}$, and 2) the predicted trajectory is calculated as
$$\underline{\mathbf{Y}}^* \cong \sum_{r=1}^{R} \underline{\mathbf{G}}_Y^{(r)} \times_1 \mathbf{t}_r^* \times_2 \mathbf{Q}_r^{(1)} \times_3 \mathbf{Q}_r^{(2)} \times_4 \mathbf{Q}_r^{(3)}.$$
In the simulations,
standard PLS was applied in the same way to the unfolded tensors.
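Once the new scores have been extracted, the prediction step is just a sum of multilinear products of each core tensor with the new score vector and the Q loadings. The numpy sketch below illustrates this; all sizes, names, and random values are hypothetical, and for brevity the response is taken as third-order with two Q factors per term (the ECoG example uses one more mode).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: J1 time samples, J2 = 3 coordinates,
# J3 = 4 markers; R block terms with mode ranks M2, M3.
J1, J2, J3, R, M2, M3 = 50, 3, 4, 3, 2, 2

t_new = rng.standard_normal((J1, R))       # new scores t_r^*, stacked columnwise
Q1 = rng.standard_normal((R, J2, M2))      # loadings Q_r^(1)
Q2 = rng.standard_normal((R, J3, M3))      # loadings Q_r^(2)
G_Y = rng.standard_normal((R, 1, M2, M3))  # core tensors G_Y^(r)

# Y* ~= sum_r  G_Y^(r)  x1 t_r^*  x2 Q_r^(1)  x3 Q_r^(2),
# written as a single einsum over the block-term index r.
Y_pred = np.einsum('ralm,ir,rjl,rkm->ijk', G_Y, t_new, Q1, Q2)
print(Y_pred.shape)  # (50, 3, 4)
```

Collapsing the sum over block terms into one einsum avoids an explicit Python loop and keeps the prediction a single dense contraction.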
Figure 14(c) shows that although the standard PLS was able
to predict the movement corresponding to each marker indi-
vidually, such a prediction is quite crude as the two-way PLS
does not adequately account for mutual information among the
four markers. The enhanced predictive performance of the BTD-
based HOPLS [the red line in Figure 14(c)] is therefore attrib-
uted to its ability to model interactions between complex latent
components of both predictors and responses.
LINKED MULTIWAY COMPONENT ANALYSIS
AND TENSOR DATA FUSION
Data fusion concerns the joint analysis of an ensemble of data
sets, such as multiple views of a particular phenomenon, where
some parts of the scene may be visible in only one or a few data
sets. Examples include the fusion of visual and thermal images
in low-visibility conditions and the analysis of human electro-
physiological signals in response to a certain stimulus but from
different subjects and trials; these are naturally analyzed
together by means of matrix/tensor factorizations. The coupled
nature of the analysis of such multiple data sets ensures that we
are able to account for the common factors across the data sets
and, at the same time, to guarantee that the individual compo-
nents are not shared (e.g., processes that are independent of exci-
tations or stimuli/tasks).
The linked multiway component analysis (LMWCA) [106],
shown in Figure 15, performs such a decomposition into shared
and individual factors and is formulated as a set of approximate joint TKDs of a set of data tensors $\underline{\mathbf{X}}^{(k)} \in \mathbb{R}^{I_1 \times I_2 \times \cdots \times I_N}$ $(k = 1, 2, \ldots, K)$
$$\underline{\mathbf{X}}^{(k)} \cong \underline{\mathbf{G}}^{(k)} \times_1 \mathbf{B}^{(1,k)} \times_2 \mathbf{B}^{(2,k)} \times_3 \cdots \times_N \mathbf{B}^{(N,k)}, \qquad (19)$$
where each factor matrix $\mathbf{B}^{(n,k)} = [\mathbf{B}_C^{(n)}, \mathbf{B}_I^{(n,k)}] \in \mathbb{R}^{I_n \times R_n}$ has 1) components $\mathbf{B}_C^{(n)} \in \mathbb{R}^{I_n \times C_n}$ (with $0 \le C_n \le R_n$) that are common (i.e., maximally correlated) to all tensors and 2) components $\mathbf{B}_I^{(n,k)} \in \mathbb{R}^{I_n \times (R_n - C_n)}$ that are tensor specific. The objective is to estimate the common components $\mathbf{B}_C^{(n)}$, the individual components $\mathbf{B}_I^{(n,k)}$, and, via the core tensors $\underline{\mathbf{G}}^{(k)}$, their mutual interactions. As
in MWCA (see the section “Tucker Decomposition”), constraints
may be imposed to match data properties [73], [76]. This enables a
more general and flexible framework than group ICA and independent vector analysis, which also perform linked analysis of multiple data sets but assume that 1) there exist only common components and 2) the corresponding latent variables are statistically independent [107], [108]. Both are quite stringent and limiting assumptions.
As an alternative to TKD, coupled tensor decompositions may be of
a polyadic or even block term type [89], [109].
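As a concrete toy illustration of the LMWCA model in (19), the sketch below builds K coupled tensors whose factor matrices share common blocks $\mathbf{B}_C^{(n)}$ while each tensor keeps its own individual blocks $\mathbf{B}_I^{(n,k)}$; all dimensions, names, and random values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sizes: K coupled third-order tensors with mode ranks R_n,
# of which C_n components per mode are shared across all K tensors.
K = 3
I = (8, 9, 10)    # I_1, I_2, I_3
Rn = (4, 4, 4)    # R_1, R_2, R_3
Cn = (2, 2, 2)    # C_1, C_2, C_3

# Common blocks B_C^(n): one per mode, reused by every tensor.
B_C = [rng.standard_normal((I[n], Cn[n])) for n in range(3)]

def build_tensor():
    """X^(k) = G^(k) x1 B^(1,k) x2 B^(2,k) x3 B^(3,k),
    where B^(n,k) = [B_C^(n), B_I^(n,k)]."""
    B = [np.hstack([B_C[n],                                       # common part
                    rng.standard_normal((I[n], Rn[n] - Cn[n]))])  # individual part
         for n in range(3)]
    G = rng.standard_normal(Rn)  # core tensor G^(k)
    return np.einsum('abc,ia,jb,kc->ijk', G, B[0], B[1], B[2])

tensors = [build_tensor() for _ in range(K)]
print([X.shape for X in tensors])  # [(8, 9, 10), (8, 9, 10), (8, 9, 10)]
```

In an actual LMWCA fit the direction is reversed: the common and individual blocks are estimated jointly from the observed tensors, typically under additional constraints on the factors.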
[FIG13] The principle of HOPLS for third-order tensors. The core tensors $\underline{\mathbf{G}}_X$ and $\underline{\mathbf{G}}_Y$ are block-diagonal. The BTD-type structure allows for the modeling of general components that are highly correlated in the first mode.