Fundamentals of Statistics contains material of various lectures and courses of H. Lohninger on statistics, data analysis and chemometrics.
See also: RBF network  
RBF Network as Kernel Estimator
RBF neural networks belong to the class of kernel estimation methods. These methods use a weighted sum of a finite set of nonlinear functions Φ(||x − c_i||) to approximate an unknown function f(x). The approximation is constructed from the data samples presented to the network using the following equation:

f(x) = Σ_{i=1..h} w_i Φ(||x − c_i||)
where h is the number of kernel functions, Φ() is the kernel function, x is the input vector, c_i is a vector representing the center of the i-th kernel function in the n-dimensional space, and the w_i are the coefficients that adapt the approximating function f(x). If these kernel functions are mapped to a neural-network architecture, a three-layered network can be constructed in which each hidden node is represented by a single kernel function and the coefficients w_i represent the weights of the output layer.
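This weighted sum can be sketched in a few lines of NumPy. The Gaussian kernel exp(−d²/s) and the width parameter s are assumptions for illustration; the text introduces the network's actual kernel parameters (S and R) further below.

```python
import numpy as np

def rbf_output(x, centers, weights, s=1.0):
    """Evaluate f(x) = sum_i w_i * phi(||x - c_i||).

    A plain Gaussian kernel phi(d) = exp(-d**2 / s) is assumed here;
    s is an illustrative width parameter.
    """
    # Euclidean distance from the input vector to each kernel center c_i
    d = np.linalg.norm(centers - x, axis=1)
    # responses of the h hidden nodes (one kernel function per node)
    phi = np.exp(-(d ** 2) / s)
    # output layer: weighted linear sum of the kernel responses
    return weights @ phi

# toy network with h = 2 kernels in a 2-dimensional input space
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
weights = np.array([1.0, -0.5])
print(rbf_output(np.array([0.0, 0.0]), centers, weights))
```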
When R equals 0, the kernel function is the classical Gaussian function (see figure below). Larger values of R flatten the top of the kernel, so that with increasing R its shape approaches that of a cylinder.
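The exact formula of the flat-top kernel is not reproduced in this extract. One plausible form that matches the described behavior (a classical Gaussian for R = 0, an increasingly cylinder-like plateau for growing R) is a Gaussian whose argument is shifted by the plateau radius R; this sketch is an assumption, not the author's definitive formula.

```python
import numpy as np

def flat_top_kernel(d, R=0.0, s=1.0):
    """One possible flat-top Gaussian kernel (illustrative assumption).

    For distances d <= R the kernel is constant at 1 (the flat top);
    beyond R it falls off like a Gaussian with width parameter s.
    With R = 0 this reduces to the plain Gaussian exp(-d**2 / s).
    """
    d = np.asarray(d, dtype=float)
    return np.where(d <= R, 1.0, np.exp(-((d - R) ** 2) / s))

# R = 0: ordinary Gaussian; R = 1: flat plateau of radius 1 around the center
print(flat_top_kernel([0.0, 0.5, 1.0, 2.0], R=0.0))
print(flat_top_kernel([0.0, 0.5, 1.0, 2.0], R=1.0))
```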
The output layer of an RBF network combines the kernel functions of all hidden neurons into a weighted linear sum. Depending on the network parameters, the response of the network can assume virtually any shape. Several possible response functions, obtained from a network with five hidden neurons by varying the S and R parameters, are displayed below.


Last Update: 2012-10-08