User's Guide

7 Recursive Techniques for Model Identification
Mathematics of the Unnormalized and Normalized Gradient Algorithm
In the linear regression case, the gradient methods are also known as the least mean squares (LMS) methods.
The following set of equations summarizes the unnormalized gradient and normalized gradient adaptation algorithms:

$$\hat{\theta}(t) = \hat{\theta}(t-1) + K(t)\bigl(y(t) - \hat{y}(t)\bigr)$$

$$\hat{y}(t) = \psi^{T}(t)\,\hat{\theta}(t-1)$$

$$K(t) = Q(t)\,\psi(t)$$
In the unnormalized gradient approach, Q(t) is the product of the gain γ and the identity matrix:

$$Q(t) = \gamma I$$
In the normalized gradient approach, Q(t) is the product of the gain γ and the identity matrix, normalized by the square of the magnitude of the gradient ψ(t):

$$Q(t) = \frac{\gamma}{\lvert\psi(t)\rvert^{2}}\, I$$
These choices of Q(t) update the parameters in the negative gradient direction, where the gradient is computed with respect to the parameters.
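As a minimal sketch, the update equations above can be implemented as follows. The function name `gradient_update`, the `eps` safeguard against a zero gradient, and the toy regression data are illustrative assumptions, not part of the toolbox:

```python
import numpy as np

def gradient_update(theta, psi, y, gain, normalized=False, eps=1e-8):
    """One step of the gradient (LMS) adaptation:
    theta(t) = theta(t-1) + K(t) * (y(t) - yhat(t)),
    with K(t) = Q(t) * psi(t), and Q(t) = gain (unnormalized)
    or Q(t) = gain / |psi(t)|^2 (normalized)."""
    y_hat = psi @ theta                        # yhat(t) = psi(t)^T theta(t-1)
    q = gain / (psi @ psi + eps) if normalized else gain
    k = q * psi                                # gain vector K(t)
    return theta + k * (y - y_hat), y_hat

# Toy linear-regression identification (illustrative data)
rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])
theta = np.zeros(2)
for _ in range(2000):
    psi = rng.standard_normal(2)               # regressor vector psi(t)
    y = psi @ theta_true                       # noise-free measurement y(t)
    theta, _ = gradient_update(theta, psi, y, gain=0.05, normalized=True)
# theta now approximates theta_true
```

The normalized variant divides the gain by the squared magnitude of the gradient, which makes the step size insensitive to the scale of the regressors.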
Using the Unnormalized and Normalized Gradient Algorithms
The general syntax for the command described in “Algorithms for Recursive
Estimation” on page 7-6 is the following:
[params,y_hat] = command(data,nn,adm,adg)