computationally efficient way to update the weights of the experts, and the LMS is
known for its efficient tracking properties, which is useful because the real data is known
to be nonstationary (a property that the RLS does not handle well). For the choice of the
learning rate, the inequality (3.5) together with the need for a sufficiently small learning
rate to track slow changes within one regime yielded η = 0.0007.
$$0 < \eta < \frac{2}{\text{tap-input power}} \qquad (3.5)$$
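As a sketch (not from the text), the stability bound (3.5) and the LMS recursion can be written out as follows. The function names and the estimate of the tap-input power as the filter length times the mean input power are assumptions for illustration, not the author's implementation:

```python
import numpy as np

def lms_step_bound(x, n_taps):
    # Stability bound (3.5): 0 < eta < 2 / (tap-input power),
    # here estimating the tap-input power as n_taps * mean input power.
    return 2.0 / (n_taps * np.mean(x ** 2))

def lms(x, d, n_taps, eta):
    # Standard LMS recursion: w <- w + eta * e[n] * u[n],
    # with tap-input vector u[n] and a priori error e[n] = d[n] - w @ u[n].
    w = np.zeros(n_taps)
    errors = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]  # [x[n], x[n-1], ..., x[n-M+1]]
        e = d[n] - w @ u
        w = w + eta * e * u
        errors.append(e)
    return w, np.array(errors)
```

A learning rate well inside the bound, as chosen above, trades convergence speed for the ability to track slow drifts in a nonstationary regime.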
3.4.3 Radial Basis Function Network Experts
We then turn to nonlinear neural networks for experts. An interesting form of
feedforward neural network is the Radial Basis Function Network (RBFN) [23, 24].
An RBF network is a feedforward neural network with a single hidden layer of N
nonlinear processing units φ, input weights t, and an output layer of linear units with
weights w. The hidden layer computes the distance between the input vector and the
center of the RBF, defined by the weight vector t_i of that processing unit. The input
vectors are processed through the nonlinear radial basis function, chosen to be a
Gaussian defined in (3.7), and through the linear layer as shown in (3.6).
$$\text{output} = \sum_{i=1}^{N} w_i \, \varphi(\text{input}, t_i) + b \qquad (3.6)$$

$$\varphi(\text{input}, t_i) = \exp\!\left(-\frac{\|\text{input} - t_i\|^2}{2\sigma^2}\right) \qquad (3.7)$$
The distance measure between vectors is the Euclidean distance (the norm induced by
the inner product), and σ is the spread of the RBF.
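The forward pass defined by (3.6) and (3.7) can be sketched directly. This is a minimal illustration, not the author's code; the function name and argument layout are assumptions:

```python
import numpy as np

def rbf_forward(x, centers, sigma, weights, bias):
    """Gaussian RBF network output for one input vector x.

    centers: (N, d) array of RBF centers t_i; sigma: common spread;
    weights: (N,) linear output weights w_i; bias: output bias b.
    """
    # Eq. (3.7): Gaussian of the squared Euclidean distance to each center.
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    phi = np.exp(-sq_dist / (2.0 * sigma ** 2))
    # Eq. (3.6): linear combination in the output layer.
    return weights @ phi + bias
```

When the input coincides with a center, that unit's activation is exactly 1 and decays with distance at a rate set by σ.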
An RBF network is composed of two kinds of layers, and they are usually trained
separately. As one may view the training of a multilayer perceptron as a "curve fitting"