Hello geeks, welcome back.
In the last video, we saw the feedforward neural network.
In this video, we'll see the implementation of the XOR and XNOR logical functions using a neural network.
Actually, this example is quite popular and easy to understand, so we'll use these functions to understand the basics of neural nets before moving further.
Let's see how XOR works.
As you can see here, if the inputs are 0 and 0, then the output will be 0.
Inputs 0 and 1 give output 1, and the same goes for inputs 1 and 0.
But if the inputs are 1 and 1, then the output will be 0.
It means that if both inputs are the same, such as (0, 0) and (1, 1), then the output will be 0; if the inputs are different, then the output will be 1.
XNOR is simply the negation of XOR.
It means the output of XNOR will be 1 wherever the output of XOR was 0, and similarly the output of XNOR will be 0 wherever XOR was 1.
And this is our sigmoid function; we are going to use this function in every neuron.
We've already discussed this: the sigmoid basically converts its input into a probability, or more specifically, a real number between 0 and 1.
This is the formula of the sigmoid function:

sigmoid(z) = 1 / (1 + e^(-z))

Look closely and you'll notice that the function is always very close to 1 for numbers greater than or equal to 5, and similarly always very close to 0 for numbers less than or equal to -5.
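That saturating behaviour can be checked with a tiny sketch in Python (the `sigmoid` helper name is just this sketch's choice):

```python
import math

def sigmoid(z):
    """Squash any real number into a value between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

# For z >= 5 the output is already very close to 1,
# and for z <= -5 it is very close to 0.
print(sigmoid(5.0))   # about 0.993
print(sigmoid(-5.0))  # about 0.007
print(sigmoid(0.0))   # exactly 0.5
```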
I hope you are aware of the logical OR, logical AND, and logical NOT operations.
Let's see some more basic neural nets before XOR and XNOR, like logical AND.
In this neural net, these are the two inputs x1 and x2, this is the bias term, and this is the truth table of the logical operation.
Let's say x1 = 0 and x2 = 0.
We multiply them by 20 and 20 respectively, then add the bias term, which is -30.
Feeding -30 to the sigmoid function, the result will be 0.
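Here's a quick sketch of that single AND neuron in Python. The weights (20, 20) and bias (-30) are the ones from the slide; the function names are just this sketch's own:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def and_neuron(x1, x2):
    # weights 20 and 20, bias -30, as on the slide
    return sigmoid(20 * x1 + 20 * x2 - 30)

# The rounded outputs reproduce the AND truth table.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(and_neuron(x1, x2)))
```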
Now check this neural net. Here the weights and bias are different, and so its result will be different.
Consider x1 = 1 and x2 = 1.
Now, multiply these numbers by 20 and 20 respectively and add the bias, which is -10.
In this case, the final output will be 1, as you can see in the truth table of the logical OR function: 1 OR 1 gives 1.
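The OR neuron only differs from the AND neuron in its bias. A minimal sketch, using the weights (20, 20) and bias (-10) from the video:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def or_neuron(x1, x2):
    # same weights as AND, but a weaker negative bias of -10,
    # so a single active input is enough to push the sum to +10
    return sigmoid(20 * x1 + 20 * x2 - 10)

# The rounded outputs reproduce the OR truth table.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(or_neuron(x1, x2)))
```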
The negation of logical OR (NOR) and ((NOT x1) AND (NOT x2)) are the same function, because of De Morgan's law, which I'm sure you have already studied.
You can check this one.
Suppose x1 = 1 and x2 = 0.
Then 1 is multiplied by -20, and similarly 0 is multiplied by -20, because the weights here are -20 and -20, plus 10, which is our bias term.
We'll get -10 after solving this, and then we pass -10 to the sigmoid function.
I have already discussed that the sigmoid function tends to 1 for numbers greater than or equal to +5 and tends to 0 for numbers less than or equal to -5.
Here the value is -10, so the answer will be 0.
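To see De Morgan's law in action, we can compare this neuron's output against the exact logical NOR for all four inputs (a small sketch; `nor_neuron` is just an illustrative name):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def nor_neuron(x1, x2):
    # weights -20 and -20 with bias +10 implement (NOT x1) AND (NOT x2),
    # which by De Morgan's law equals NOT (x1 OR x2), i.e. NOR
    return sigmoid(-20 * x1 - 20 * x2 + 10)

for x1 in (0, 1):
    for x2 in (0, 1):
        # the neuron's rounded output matches the exact logical NOR
        print(x1, x2, round(nor_neuron(x1, x2)), int(not (x1 or x2)))
```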
Now, the final step. Let me tell you that XOR and XNOR cannot be computed by a neural net with zero hidden layers, unlike logical OR, logical AND, and logical NOR.
That's because XOR is a slightly more complex function: a single neuron can only draw one linear decision boundary, and XOR's outputs cannot be separated by one.
So we'll need at least one hidden layer to learn this function.
Check out this image.
This neuron and these arrows are the AND function; this neuron and these arrows in cyan are the ((NOT x1) AND (NOT x2)), i.e. NOR, function; and this neuron and these arrows in green are the OR function.
As I already told you in my last video, this is our input layer and this is our output layer; the layers in the middle, which are neither the input layer nor the output layer, are what we call hidden layers.
I mean to say, if we create a hidden layer out of logical AND and logical NOR, and an output layer of logical OR, keeping the input layer the same, then we will get our desired result.
You can check it out.
Let's say x1 = 0 and x2 = 1; then the result of XOR would be 1, but we are computing XNOR, so the result should be 0.
For the first hidden neuron we apply the AND weights, 20 and 20, to the inputs: 0 multiplied by 20, plus 1 multiplied by 20, plus the bias -30, which results in -10.
Then we apply the sigmoid function, and the result of the first hidden neuron will be 0.
Similarly, 0 is multiplied by -20 and 1 is multiplied by -20, plus our bias term, which is 10.
The intermediate result is -10, and the final output will be 0 after passing it through the sigmoid function.
That means the output of the second hidden neuron is 0.
Now we have 0 and 0 in both the hidden neurons.
We multiply them by 20 and 20 respectively and then add -10.
As you can see, the computation is the same as the previous ones (-10 goes into the sigmoid), and our final result is 0.
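Putting all three neurons together, the whole two-layer XNOR net from this walkthrough can be sketched like this (the function names are this sketch's own; the weights and biases are the ones used in the video):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def xnor_net(x1, x2):
    a1 = sigmoid(20 * x1 + 20 * x2 - 30)    # hidden neuron 1: AND
    a2 = sigmoid(-20 * x1 - 20 * x2 + 10)   # hidden neuron 2: NOR
    return sigmoid(20 * a1 + 20 * a2 - 10)  # output neuron: OR

# XOR is simply the negation of XNOR.
def xor_net(x1, x2):
    return 1 - xnor_net(x1, x2)

# The rounded outputs reproduce the XNOR and XOR truth tables.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xnor_net(x1, x2)), round(xor_net(x1, x2)))
```

The hidden layer computes AND and NOR of the inputs, and the output neuron ORs those two signals, which is exactly XNOR: it fires when both inputs are 1 (AND) or both are 0 (NOR).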
