
Radial Basis Function Network
Introduction
This program demonstrates some function approximation
capabilities of a Radial Basis Function Network.
The user supplies a set of training points which represent
"sample" points on some arbitrary curve. Next, the user specifies
the number of equally spaced Gaussian centers and the variance for
the network. Using the training samples, the weights multiplying
each of the Gaussian basis functions are calculated using the
pseudoinverse (yielding the minimum least-squares solution). The
resulting network is then used to approximate the function between
the given "sample" points.
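The fitting step described above can be sketched in NumPy (an illustrative choice; the applet itself is a Java program, and the function names below are hypothetical, not taken from its source):

```python
import numpy as np

def rbf_design_matrix(x, centers, sigma):
    """Evaluate one Gaussian basis function per center at each sample point."""
    # Rows correspond to sample points, columns to centers.
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2))

def train_rbf(x, y, n_centers, sigma):
    """Fit the weights by the pseudoinverse (minimum least-squares solution)."""
    centers = np.linspace(x.min(), x.max(), n_centers)  # equally spaced centers
    Phi = rbf_design_matrix(x, centers, sigma)
    weights = np.linalg.pinv(Phi) @ y
    return centers, weights

def predict(x, centers, weights, sigma):
    """Network output: the sum of the weighted (scaled) Gaussians."""
    return rbf_design_matrix(x, centers, sigma) @ weights
```

With more training points than centers, the pseudoinverse gives the least-squares fit through the samples; with equal numbers it interpolates them exactly (when the design matrix is well conditioned).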
Credits
This Java applet (© 1996 
Jesse W. Hong,
Massachusetts Institute of Technology) was integrated as is into
this web page with the agreement of the author. Please send
comments about this applet directly to
jesse@mit.edu.
Instructions
 Use the first mouse button to place training points in the
upper section. Try to space them approximately equally, and place
more points than the number of centers being used.
 Adjust the number of centers and the standard deviation (width
of Gaussian) for the Gaussians used in the approximation.
 Click on "Redraw" to redraw the basis functions and the
training points. After any adjustment to the number of centers or
the standard deviation, click this button to show the new basis
functions.
 Click on "Go!" to train the network.
 The "Reset" button erases all the training points and lets you
start again.
 Status messages are shown at the bottom of the
applet.
Comments
After learning, the scaled plot of each Gaussian is shown in
green in the upper graph. The resulting approximation of the
curve, which is the sum of all the scaled Gaussians, is shown in
red.
Try to make sure that your data points are equally spaced. If
they are not and you have a lot of centers, then in between some of
the data points, the fitted curve may go off screen. Just add a new
point in these regions and click on "Go!" again.
Play around with the number of centers and the standard
deviation and see how the smoothness and accuracy of the
approximation are affected. For example, make the standard
deviation 0.25, place 10 equally spaced data points, click on
"Go!", then make the standard deviation 1.0, and click on "Go!"
again. You can also vary the number of data points, and so on.
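The effect of the standard deviation on smoothness can be checked numerically as well. This is a minimal NumPy sketch (an illustrative choice, not the applet's own code) that fits noisy samples with narrow and with wide Gaussians and compares the roughness of the two fitted curves:

```python
import numpy as np

def fit_and_predict(x, y, x_eval, n_centers, sigma):
    """Least-squares RBF fit on (x, y), then evaluation on x_eval."""
    centers = np.linspace(x.min(), x.max(), n_centers)
    phi = lambda pts: np.exp(-((pts[:, None] - centers[None, :]) ** 2)
                             / (2.0 * sigma ** 2))
    weights = np.linalg.pinv(phi(x)) @ y
    return phi(x_eval) @ weights

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)   # noisy "sample" points
grid = np.linspace(0.0, 2.0 * np.pi, 400)

def roughness(sigma):
    """Mean absolute second difference of the fitted curve on a dense grid."""
    curve = fit_and_predict(x, y, grid, 10, sigma)
    return float(np.mean(np.abs(np.diff(curve, 2))))
```

With 10 centers, narrow Gaussians (standard deviation 0.25) barely overlap and produce a bumpy curve, while wide ones (1.0) yield a much smoother fit, mirroring what the applet shows on screen.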
Questions
 Put down 32 data points so that the data looks like a noisy
version of some function. Take the width of the Gaussians as
1.5.
a) As a first step, try to do an interpolation, that is, use 32
Gaussians. Click on "Go!" and look at the result. Then reduce the
number of centers to 16, then to 8, and finally to 4. What is the
result?
b) Repeat the same sequence but with a width of 1.0 and 0.5. What
is the result?
 Put down about 30 data points in two clusters, one for low
x-values and another one for high x-values. Take 10 centers.
a) Do a fit with standard deviation 1.5, then with 1.0. Is the
result reasonable?
b) Now add one extra data point in the middle between the two
clusters. Does the result improve? Try to optimize the parameters
(number of centers and standard deviation).
