In this video, we will explore the concept of threshold functions,
types of threshold functions, and
logic gates that are used in artificial neural networks.
A threshold function is considered one of the key components
of an artificial neural network,
and its objective is to determine whether a neuron
is activated or not.
Threshold functions typically introduce nonlinear properties into an artificial
neural network by calculating the weighted sum of the inputs and adding a bias to it.
This, in turn,
will help us decide whether to activate a particular neuron or not.
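The decision described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the video; the function name, weights, and bias values are all hypothetical.

```python
# Minimal sketch of a threshold neuron: compute the weighted sum of the
# inputs plus a bias, then activate only if it exceeds the threshold.
# (All names and values here are illustrative.)

def neuron_fires(inputs, weights, bias, threshold=0.0):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return weighted_sum > threshold

# 1.0*0.6 + 0.5*0.4 - 0.5 = 0.3 > 0, so the neuron activates
print(neuron_fires([1.0, 0.5], [0.6, 0.4], bias=-0.5))  # True
```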
Threshold functions are generally used in computational systems that
are essentially based on biological neural networks.
They are also used to quantify the output of neurons in the output layer.
These quantified values are used when we perform classification
in neural networks.
Now, the threshold function is a Boolean function
that we use to determine whether the value of the input exceeds an
identified threshold, and it is one of a series of activation functions
that are prominently used in artificial neural networks.
Now, that we are aware of the concept of threshold functions and
their utilization, let's explore the prominent types of threshold functions.
There are five types of threshold functions that we can use in
artificial neural networks and they include Unit step,
Sigmoid, Piecewise linear, Gaussian, and Linear.
In the case of the unit step, the output is set at one of two levels.
We generally use this function to determine whether the total output
is greater than or less than the predefined threshold value.
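As a quick sketch of the unit step, here is one common formulation in Python (the threshold value of 0 is just an illustrative default):

```python
def unit_step(x, threshold=0.0):
    # Output is one of two levels depending on whether the
    # total value exceeds the predefined threshold.
    return 1 if x > threshold else 0

print(unit_step(0.7))   # 1
print(unit_step(-0.3))  # 0
```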
The second type of threshold function, sigmoid, typically comprises two
functions, namely a logistic function and a tangential (hyperbolic tangent) function.
These two functions are different from the perspective of their range of values.
The value of the logistic function ranges from 0 to 1, and
the value of the tangential function ranges from -1 to +1.
Piecewise linear is the third type of threshold function,
where the output is directly proportional to the weighted output.
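One common saturating form of the piecewise linear function can be sketched as follows; the saturation bounds of -1 and +1 are an assumption for illustration, not values given in the video:

```python
def piecewise_linear(x, lo=-1.0, hi=1.0):
    # Output is directly proportional to the input between the
    # two bounds, and saturates at 0 or 1 outside them.
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)
```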
Gaussian functions are bell-shaped continuous curves.
The neuron's output can be low or high.
The output level depends on how close the neuron's
input is to the identified central value after averaging.
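A Gaussian function can be sketched as below; the centre and width parameters are illustrative defaults:

```python
import math

def gaussian(x, mu=0.0, sigma=1.0):
    # Bell-shaped continuous curve: the output is high when the
    # input is close to the centre mu, and falls off on either side.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

print(gaussian(0.0))  # 1.0 at the centre of the bell
```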
Finally, the linear function is quite similar to linear regression.
When we use the linear activation function,
it simply passes the weighted sum of the neuron's inputs through as the output.
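A linear activation is the simplest of the five to sketch; the slope parameter here is an illustrative assumption:

```python
def linear(x, a=1.0):
    # Output is directly proportional to the input (the identity
    # when a = 1), so the weighted sum passes straight through.
    return a * x

print(linear(2.0))       # 2.0
print(linear(2.0, 0.5))  # 1.0
```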
Now understanding logic gates and
their application to artificial neural networks is essential.
Logic gates can help us derive essential values and
models that we can use to build or implement neural network models.
The three sample illustration figures depict the application of AND,
OR, and XNOR gates.
The first sample figure located on the left
illustrates the application of the AND gate.
We can observe that we get the output value one
when both the input values are one.
The second sample figure, located in the middle,
illustrates the application of the OR gate, and we can observe that we
get the output value one whenever at least one of the input values is one.
Finally, the third sample figure,
located on the right, illustrates the application of the XNOR gate.
And we can clearly observe that we get the output value one whenever both
input values are the same.
By using these logic gates on neurons, we would be able to derive simple models
for complex artificial neural network scenarios.
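The three gates described above can be sketched with threshold neurons in Python. The specific weights and biases below are a common illustrative choice, not values from the video. Note that AND and OR each fit a single threshold neuron, while XNOR is not linearly separable, so it is composed from two layers of neurons:

```python
def step(x):
    # Unit step threshold: fire when the weighted sum is non-negative
    return 1 if x >= 0 else 0

def and_gate(a, b):
    # Weights (1, 1), bias -1.5: fires only when both inputs are one
    return step(a + b - 1.5)

def or_gate(a, b):
    # Weights (1, 1), bias -0.5: fires when at least one input is one
    return step(a + b - 0.5)

def xnor_gate(a, b):
    # XNOR is not linearly separable, so a single threshold neuron
    # cannot compute it; we compose two layers instead:
    # XNOR(a, b) = OR(AND(a, b), AND(NOT a, NOT b))
    return or_gate(and_gate(a, b), and_gate(1 - a, 1 - b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_gate(a, b), or_gate(a, b), xnor_gate(a, b))
```

Running the loop reproduces the truth tables from the three sample figures: AND fires only for (1, 1), OR fires whenever any input is one, and XNOR fires whenever both inputs match.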
