Friday, March 13, 2020

[Deep Learning Tutorial] 02 How to Make an Artificial Neural Network (a machine learning algorithm explained with Python programming)

02_How_to_make_artificial_neural_network

How to make an Artificial Neural Network

  • Hello, this is DVM TV, where I talk about Deep Learning and Machine Learning.
  • Today, I will talk about how to make an Artificial Neural Network.
  • An Artificial Neural Network is actually very simple.
  • If you stack more than three neural layers, the result is a Deep Neural Network.
  • So let's make an Artificial Neural Network.
In [4]:
from IPython.display import Image
In [41]:
Image("NETWORK.png",height=300,width=300)
Out[41]:
(figure: NETWORK.png, a diagram of the network with inputs X, weights W, biases B, and activation functions A)
  • "X" is called input data such as image or natural language and so on
  • "W" is called weight. It means how much the input value affects the result
  • "B" is called bias. It decide whether to activate input values multiplied by weight.
  • "A" is called activation function. Let's see what kind of activation function are.

Activation Functions

What's the step function?

  • If we use the step function in the hidden layers, the model is not a neural network but a perceptron.
In [42]:
import numpy as np
import matplotlib.pyplot as plt
In [43]:
x_value = np.arange(-1,1,0.01)
def _step_input(x):
    return np.array(x > 0, dtype=int)  # 1 where x > 0, else 0
plt.plot(x_value,_step_input(x_value))
plt.grid()
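  • As a quick check, the step function returns 1 for positive inputs and 0 otherwise (the sample inputs below are illustrative):
In [ ]:
_step_input(np.array([-0.5, 0.0, 0.5]))  # expected: array([0, 0, 1])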

What's the sigmoid function?

  • For binary (two-class) classification, we use the sigmoid function at the output layer.
In [44]:
def _sigmoid(x):
    return 1/(1+np.exp(-x))
plt.plot(x_value,_sigmoid(x_value))
plt.grid()
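  • The sigmoid squashes any real number into the range (0, 1), so its output can be read as a probability (illustrative inputs):
In [ ]:
_sigmoid(np.array([-10.0, 0.0, 10.0]))  # approx. array([4.54e-05, 0.5, 0.9999546])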

What's the ReLU function?

  • The ReLU function is commonly used in hidden layers instead of the sigmoid function.
In [45]:
def _relu(x):
    return np.maximum(0,x)
plt.plot(x_value,_relu(x_value))
plt.grid()
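  • ReLU clips negative values to 0 and passes positive values through unchanged (illustrative inputs):
In [ ]:
_relu(np.array([-2.0, -1.0, 0.0, 1.0, 2.0]))  # expected: array([0., 0., 0., 1., 2.])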

What's the Identity Function?

  • When we solve a regression problem, we use the identity function at the output layer.
In [46]:
def _identification(x):
    return x
plt.plot(x_value,_identification(x_value))
plt.grid()
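  • The identity function returns its input unchanged, so the network's raw output becomes the regression prediction directly (illustrative inputs):
In [ ]:
_identification(np.array([-1.0, 0.0, 2.5]))  # returns the input unchanged: array([-1., 0., 2.5])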

What's the Softmax Function?

  • When we solve a multi-class classification problem, we use the softmax function at the output layer.
In [48]:
_input = np.array([0.1,0.2,0.3])
def _normal_softmax(x):
    return np.exp(x)/np.sum(np.exp(x))  # exponentiate, then normalize so the outputs sum to 1
_normal_softmax(_input)
Out[48]:
array([0.30060961, 0.33222499, 0.3671654 ])
  • However, the naive softmax is numerically unstable.
  • If an input value is large, np.exp overflows to infinity, and the division then produces nan.
  • This is fixed by subtracting the largest input value from every input before exponentiating; that multiplies the numerator and denominator by the same constant, so the result is unchanged.
In [49]:
_input = np.array([0.1,0.2,0.3])
def _advanced_softmax(x):
    _max = np.max(x)  # subtracting the max keeps np.exp from overflowing
    return np.exp(x-_max)/np.sum(np.exp(x-_max))
_advanced_softmax(_input)
Out[49]:
array([0.30060961, 0.33222499, 0.3671654 ])
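  • We can check the difference with large inputs (the values below are illustrative): the naive version overflows to nan, while the stabilized version still works.
In [ ]:
big = np.array([1000.0, 1001.0, 1002.0])
print(_normal_softmax(big))    # overflow RuntimeWarning; prints [nan nan nan]
print(_advanced_softmax(big))  # approx. [0.09003057 0.24472847 0.66524096]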

Let's build the Artificial Neural Network shown in this figure

In [40]:
Image("NETWORK.png",height=300,width=300)
Out[40]:
(figure: NETWORK.png, the same network diagram as above)
In [53]:
class _artificial:
    def __init__(self):
        self.w1 = np.array([[0.1,0.2,0.3],
                            [0.1,0.2,0.3]]) #2x3
        self.b1 = np.array([0.1,0.2,0.3])   #1x3
        self.w2 = np.array([[0.1,0.2,0.3],
                            [0.1,0.2,0.3],
                            [0.1,0.2,0.3]])#3x3
        self.b2 = np.array([0.1,0.2,0.3])   #1x3
        self.w3 = np.array([[0.1,0.2],
                            [0.1,0.2],
                            [0.1,0.2]])#3x2
        self.b3 = np.array([0.1,0.2])#1x2
    
    def _step_input(self,x):
        return np.array(x>0, dtype=int)
    
    def _sigmoid(self,x):
        return 1/(1+np.exp(-x))
    
    def _identification(self,x):
        return x
    
    def _relu(self,x):
        return np.maximum(0,x)
    
    def _feedforward(self,x):
        a1 = self._step_input(np.dot(x,self.w1)+self.b1) # layer 1: step activation
        a2 = self._relu(np.dot(a1,self.w2)+self.b2)      # layer 2: ReLU activation
        a3 = self._sigmoid(np.dot(a2,self.w3)+self.b3)   # output layer: sigmoid activation
        return a3
if __name__ == "__main__":
    result = _artificial()._feedforward(np.array([1.0,0.5]))
    print(result)
[0.58419052 0.6637387 ]
  • This kind of network, where every neuron in one layer is connected to every neuron in the next layer, is called a fully connected network, and it is the most popular architecture.
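  • Because each layer is just a matrix multiplication, the same network can also process a batch of inputs at once; a minimal sketch (the batch values below are illustrative):
In [ ]:
batch = np.array([[1.0, 0.5],
                  [0.2, 0.8]])             # two samples with two features each
out = _artificial()._feedforward(batch)    # shapes: (2,2)·(2,3) -> (2,3)·(3,3) -> (2,3)·(3,2)
print(out.shape)                           # expected: (2, 2)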
