 
 
hello everyone, in this video tutorial
we will look at another type of neural network
called the radial basis function (RBF) neural network
so first we will go through a
brief introduction to what a radial basis function neural network is
and then we will look at its application in MATLAB
 
the figure on the screen shows the plot of a
radial basis function, so as you can see
if the input, that is the Euclidean distance between
the input vector and the center of the Gaussian neuron that we specify,
is very large, that is, the input vector is very far from the center,
the output of the radial basis function
will be near to 0
and if they are very near to each other then it will be close to 1
so this is the formula of our radial basis function
after passing through the radial basis layer
we have our second layer, which is a linear layer
this is the same as in the case of the multilayer perceptron
neural network
the weights of this layer can be found by a
simple inverse matrix operation
or by using a learning algorithm like
gradient descent or the LM (Levenberg-Marquardt) algorithm
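The Gaussian radial basis function described above can be sketched in a few lines. This is an illustrative Python/NumPy version (the video itself uses MATLAB), using the common width parameterization exp(-d²/(2·spread²)); note that MATLAB's `radbas` instead computes exp(-n²) with the distance scaled by a bias of 0.8326/spread, so the exact width convention differs slightly:

```python
import numpy as np

def gaussian_rbf(x, c, spread):
    """Gaussian radial basis function: outputs 1 when the input x
    coincides with the center c, and falls toward 0 as the Euclidean
    distance between them grows."""
    d = np.linalg.norm(np.asarray(x) - np.asarray(c))
    return np.exp(-d ** 2 / (2 * spread ** 2))

center = np.array([0.0, 0.0])
print(gaussian_rbf([0.0, 0.0], center, 1.0))    # 1.0 at the center
print(gaussian_rbf([10.0, 10.0], center, 1.0))  # vanishingly small far away
```
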
 
 
 
so in this method the RBF centers are added one at a time
until a network that meets the error goal is constructed

so all the training examples are considered as candidates
for the center, and the one that gives the greatest reduction in the
mean squared error, or any cost function that is specified,
will be selected as the new hidden unit
so using this algorithm not only the
weights of the output layer are determined
but also the number and the positions of the RBF centers
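The constructive procedure just described can be sketched as a simple greedy loop. This is an illustrative Python/NumPy sketch for 1-D inputs, not MATLAB's actual OLS implementation (which uses an orthogonalized error-reduction criterion rather than refitting at every step); `greedy_rbf_fit` and its parameters are hypothetical names:

```python
import numpy as np

def greedy_rbf_fit(X, y, spread, goal, max_neurons):
    """Sketch of OLS-style constructive training: every training input
    is a candidate center; at each step the candidate that yields the
    lowest mean squared error is added as a new hidden unit, until the
    error goal or the neuron limit is reached."""
    centers = []
    mse = float(np.mean(y ** 2))  # error of the empty (all-zero) network
    while mse > goal and len(centers) < max_neurons:
        best_err, best_c = None, None
        for c in X:
            if any(c == cc for cc in centers):
                continue
            cand = np.array(centers + [c])
            # Hidden-layer activations for all inputs vs. chosen centers.
            Phi = np.exp(-(X[:, None] - cand[None, :]) ** 2
                         / (2 * spread ** 2))
            # Output-layer weights by least squares.
            w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            err = float(np.mean((Phi @ w - y) ** 2))
            if best_err is None or err < best_err:
                best_err, best_c = err, c
        centers.append(best_c)
        mse = best_err
    return np.array(centers), mse

X = np.linspace(0, 2 * np.pi, 20)
y = np.sin(X)
centers, mse = greedy_rbf_fit(X, y, spread=1.0, goal=1e-4, max_neurons=10)
print(len(centers), mse)
```

Note how the loop determines both the output weights and the number and positions of the centers, exactly the property of the algorithm described above.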
so coming back to MATLAB
this is the small piece of code that i have written
for approximating a function
this is a random function y
which is sampled at
equally spaced input points
and to the corresponding outputs we add random noise
then we make use of the command newrb
this command creates a radial basis function network using the
OLS learning algorithm that we have already discussed
 
so the parameters that it takes are the
input vectors, the target vectors, and the goal, the final
performance that you require
then we have the spread; spread is a parameter that defines
how wide each Gaussian neuron is, and hence how many
are required to smoothly fit a function
the larger the spread, the smoother the function approximation
 
but too large a spread means a lot of neurons
are required to fit a fast-changing function
and too small a spread means many neurons are required to fit even a
smooth function, and thus it may cause
overfitting of the network that we have created
it may not generalize well
the spread term basically enters the Gaussian function formula as the width of each neuron
 
and for getting a good value of spread
it is recommended to select it larger than the
distance between adjacent data points
then we select the maximum number of neurons
that we need
and then this is the number of neurons
after which the result will be shown in the command window
after that we use the network
and simulate it with the input to get the results
then we have our plots
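The overall flow of the script, sampling a function at equally spaced points, adding noise, training an RBF network, then simulating it on the inputs, can be mirrored in an illustrative Python/NumPy sketch. The target function, spread, and number of neurons below are arbitrary stand-ins for whatever the MATLAB script uses, and the centers are fixed on a grid rather than grown by newrb:

```python
import numpy as np

rng = np.random.default_rng(0)

# Equally spaced inputs with noisy targets, as in the script.
x = np.linspace(-1, 1, 100)
y = np.sin(np.pi * x) + 0.05 * rng.standard_normal(x.size)

# A small fixed grid of Gaussian neurons (newrb grows this set itself).
spread = 0.4
centers = np.linspace(-1, 1, 7)
Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * spread ** 2))

# Linear output layer solved by least squares.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# "Simulate" the network on the inputs, like sim(net, x) in MATLAB.
y_hat = Phi @ w
print("training MSE:", np.mean((y_hat - y) ** 2))
```

With a well-chosen spread the training error sits near the noise floor, which is the behavior the plots in the video illustrate.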
let's run this code; as you can see in this figure
the red line is a good approximation of our data points
and for this we used 7 neurons
we have 1 input and 1 output
let's change the value of spread and look at
what effect it will have on our result
so let's change the value of spread to 0.1
and click on run
so as you can see
instead of last time, when we had 7 neurons,
in this case we have 55 neurons
and also the red line is not a good approximation
we can see that the red line has
overfitted our data
so you can vary the value of spread
and check which value is best for your data set
we also have other neural network functions
for creating a radial basis function network
one of them is newrbe, so in that case the
'e' represents an exact neural network
instead of, like in this case, 7 neurons
in the hidden layer, in that case
the number of neurons in the hidden layer will be equal to the
number of samples that we have supplied
so it will be equal to 3000 in that case
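The "exact" design can be sketched directly: with one Gaussian neuron per training sample, the interpolation matrix is square and a single linear solve makes the network reproduce every training target. An illustrative Python/NumPy sketch (the video's actual command is MATLAB's newrbe; the data and spread here are arbitrary):

```python
import numpy as np

# In the spirit of newrbe: one Gaussian neuron per training sample, so
# the interpolation matrix is square and the network reproduces every
# training target exactly (zero error on the training set).
x = np.linspace(0, 1, 10)
y = x ** 2
spread = 0.15
Phi = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * spread ** 2))
w = np.linalg.solve(Phi, y)         # exact solve, no approximation
print(np.max(np.abs(Phi @ w - y)))  # tiny residual, ~machine precision
```

This also makes clear why the hidden layer would have 3000 neurons for 3000 samples: the matrix `Phi` has one column per sample.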
then we have the generalized regression neural network
and the figure for the same is shown on the screen
so it also contains an input layer
and the radial basis layer, but in the output layer
we have a certain modification: as you can see we are using a
normalized dot product
so what it does is produce
elements which are the dot product
of the layer weights and
the vector obtained by passing the input through the radial basis layer
and all these are normalized by the
sum of the elements input to this block
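The normalized dot product just described amounts to a weighted average of the training targets, with the Gaussian activations as the weights. An illustrative Python/NumPy sketch for 1-D data (`grnn_predict` is a hypothetical name; MATLAB's newgrnn builds this structure for you):

```python
import numpy as np

def grnn_predict(x_query, X_train, y_train, spread):
    """GRNN-style prediction: the radial basis layer produces one
    Gaussian activation per training sample, and the output layer
    takes the dot product with the targets, normalized by the sum
    of the activations (a weighted average of the training targets)."""
    a = np.exp(-(x_query - X_train) ** 2 / (2 * spread ** 2))
    return np.dot(a, y_train) / np.sum(a)

X = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 4.0])
# A query at a training point with a small spread returns (almost)
# that point's target, since its activation dominates the average.
print(grnn_predict(1.0, X, y, spread=0.1))
```

Because of the normalization, the output is always a convex combination of the training targets, which is what distinguishes this layer from the plain linear layer used earlier.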
then we also have the probabilistic neural network
the figure for this is shown on the screen
so in this case, instead of the linear layer,
we have a competitive layer
you can find more about these
neural networks by consulting the
neural network documentation in MATLAB
so that's it everyone, this is it for this video
hope you liked it, and please like, subscribe, and share
and thanks for watching
