Algorithms for Recursive Estimation
$\hat{\theta}(t)$ is the parameter estimate at time t. y(t) is the observed output at time t, and $\hat{y}(t)$ is the prediction of y(t) based on observations up to time t-1. The gain, K(t), determines how much the current prediction error $y(t)-\hat{y}(t)$ affects the update of the parameter estimate. The estimation algorithms minimize the prediction-error term $y(t)-\hat{y}(t)$.
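In its standard form, the parameter update uses this gain to scale the prediction error that corrects the previous estimate:

$$\hat{\theta}(t) = \hat{\theta}(t-1) + K(t)\left[\, y(t) - \hat{y}(t) \,\right]$$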
The gain has the following general form:

$$K(t) = Q(t)\,\psi(t)$$
The recursive algorithms supported by the System Identification Toolbox product differ in how they choose the form of Q(t) and compute $\psi(t)$, where $\psi(t)$ represents the gradient of the predicted model output $\hat{y}(t\,|\,\theta)$ with respect to the parameters $\theta$.
The simplest way to visualize the role of the gradient $\psi(t)$ of the parameters is to consider models with a linear-regression form:

$$y(t) = \psi^{T}(t)\,\theta_{0}(t) + e(t)$$
In this equation, $\psi(t)$ is the regression vector that is computed based on previous values of measured inputs and outputs. $\theta_{0}(t)$ represents the true parameters. e(t) is the noise source (innovations), which is assumed to be white noise. The specific form of $\psi(t)$ depends on the structure of the polynomial model.
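For illustration, with an assumed second-order ARX structure the regression vector collects delayed outputs and inputs, and the parameter vector collects the corresponding polynomial coefficients:

$$\psi(t) = \left[\, -y(t-1),\; -y(t-2),\; u(t-1),\; u(t-2) \,\right]^{T}, \qquad \theta = \left[\, a_{1},\; a_{2},\; b_{1},\; b_{2} \,\right]^{T}$$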
For linear regression equations, the predicted output is given by the following equation:

$$\hat{y}(t) = \psi^{T}(t)\,\hat{\theta}(t-1)$$
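The following plain-MATLAB sketch ties these equations together on simulated data. It is only an illustration of the update structure, not the toolbox implementation; the model order, the signals, and the least-squares choice of gain are assumptions made for this example.

    N      = 200;                          % number of samples (assumed)
    theta0 = [0.5; -0.3];                  % "true" parameters theta_0 (assumed)
    u      = randn(N,1);                   % measured input
    y      = zeros(N,1);
    theta  = zeros(2,1);                   % parameter estimate theta_hat
    P      = 100*eye(2);                   % matrix used to form the gain

    for t = 3:N
        psi_t = [u(t-1); u(t-2)];          % regression vector psi(t) built from past data
        y(t)  = psi_t'*theta0 + 0.1*randn; % simulated output y(t) = psi(t)'*theta_0 + e(t)

        yhat  = psi_t'*theta;                 % prediction y_hat(t) = psi(t)'*theta_hat(t-1)
        K     = P*psi_t/(1 + psi_t'*P*psi_t); % gain K(t) = Q(t)*psi(t), a least-squares choice
        theta = theta + K*(y(t) - yhat);      % update driven by the prediction error
        P     = P - K*(psi_t'*P);             % recursive update of the gain matrix
    end

After the loop, theta approximates theta0, and the trajectory of the estimate shows how K(t) weights each new prediction error.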