Artificial Neural Network - 201

Quick recap!!
In my previous article, I introduced you to the Artificial Neural Network and how things actually work in the background while building one.
I hope you remember 'the Activation Function'!

The activation function is the most important part of how a neuron works. A neuron first computes the weighted sum of all its input values, i.e. the sum of the products of each input and its corresponding weight; the activation function is then applied to that weighted sum and decides the signal the neuron passes on.
There are several activation functions, each suited to different scenarios, for example when input values can lie anywhere between '-∞' and '+∞'.
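To make the idea concrete, here is a minimal sketch of a single neuron: it computes the weighted sum of its inputs and then passes that sum through an activation function. The function and variable names (`neuron_output`, `identity`) are my own illustrative choices, not standard API names; the identity activation is used only so the raw weighted sum is visible.

```python
def neuron_output(inputs, weights, activation):
    """Weighted sum of inputs, passed through an activation function."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return activation(weighted_sum)

# Placeholder activation that just passes the weighted sum through,
# so we can see the sum itself before any squashing is applied.
identity = lambda z: z

print(neuron_output([1.0, 2.0], [0.5, -0.25], identity))  # 0.5*1.0 - 0.25*2.0 = 0.0
```

Swapping `identity` for any of the four functions below changes how the neuron fires without touching the weighted-sum plumbing.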
Let’s understand the four prominent activation functions that are often used in the implementation of a neural network:

1. Threshold Function:

The threshold function is the most basic activation function. On the horizontal axis we have the weighted sum of the input values, i.e. Y = ϕ(∑WiXi), while the vertical axis takes only the values 0 and 1.
As you may have noticed, f(x) takes only two values, 0 or 1: if the input 'x' is greater than or equal to 0, the output signal is 1; otherwise it is 0. You have probably got the idea that a threshold function can be used when the dependent variable, that is, our desired output, is categorical rather than numeric. For example, when we want to predict an output value of YES or NO.
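The rule above can be sketched in a few lines; `threshold` is an illustrative name for this step function, assuming the conventional cutoff at 0 described in the text:

```python
def threshold(x):
    """Step/threshold activation: outputs 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

print(threshold(-2.5))  # 0
print(threshold(0.0))   # 1
print(threshold(3.7))   # 1
```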

2. Sigmoid Function:

The sigmoid function is similar to the threshold function, except that it is a bit more complex and has a smoother curve. It is also used when our dependent variable is binary, i.e. either 0 or 1; moreover, it is conventionally used when we need the probability of a value rather than a hard decision. If you know about regression techniques, then you might have noticed that the sigmoid function looks exactly like the logistic regression curve.
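A minimal sketch of the sigmoid, using its standard formula 1 / (1 + e^-x); note how it squashes any real input into the open interval (0, 1), which is why it reads naturally as a probability:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))             # 0.5, the midpoint of the curve
print(round(sigmoid(4), 3))   # close to 1 for large positive inputs
print(round(sigmoid(-4), 3))  # close to 0 for large negative inputs
```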

3. Rectifier Function:

The rectifier function is one of the most popular activation functions in artificial neural networks. Its output is 0 as long as the weighted sum on the horizontal axis is below 0; once the weighted sum increases past 0, the output value increases along with it. The rectifier function is typically used in the hidden layers (don't worry, we'll talk about them!!!!) of an Artificial Neural Network [ANN] model.
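The rectifier (commonly called ReLU, for Rectified Linear Unit) can be sketched as max(0, x):

```python
def relu(x):
    """Rectifier (ReLU): 0 for negative inputs, identity for positive inputs."""
    return max(0.0, x)

print(relu(-3.0))  # 0.0 — negative weighted sums are clipped to zero
print(relu(0.0))   # 0.0
print(relu(2.5))   # 2.5 — positive weighted sums pass through unchanged
```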

4. Hyperbolic Tangent (tanh):

f(x) = (e^x - e^-x) / (e^x + e^-x)
The hyperbolic tangent (tanh) is another function used in neural networks. It looks similar to the sigmoid function, but its values range from -1 to 1. It is used as an alternative to the sigmoid in order to overcome the sigmoid's disadvantages, one being that the sigmoid's output is never zero-centered, while tanh's is.
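A direct sketch of the formula above; Python's standard library already provides `math.tanh`, so we can check our hand-rolled version against it:

```python
import math

def tanh(x):
    """Hyperbolic tangent from its definition: (e^x - e^-x) / (e^x + e^-x)."""
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

print(tanh(0))                                   # 0.0 — zero-centered, unlike sigmoid
print(round(tanh(1), 4), round(math.tanh(1), 4)) # our formula matches the library
```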
So, now I hope you understand the workings and importance of activation functions in Artificial Neural Network models.
I'll come back with more exciting machine learning exercises for those who want to dig deeper into machine learning. Till then, Happy Deep Learning!!!!!