from IPython.display import Image
Image('NEURAL.PNG',width=400,height=400)
- We call this structure Deep Learning
Why we need "Deep Learning"
- It is very powerful for solving complex, nonlinear problems.
- Let's look at the AND, NAND, OR, and XOR gate examples.
Perceptron
- A perceptron uses a step function as its activation function.
- A neural network uses a continuous function such as the sigmoid (see the sketch below).
- But their mechanisms are similar.
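- A minimal sketch (assuming only numpy; not part of the original notebook) comparing the perceptron's step activation with the sigmoid used by neural networks:
import numpy as np

def step(z):
    # Perceptron activation: 1 when the weighted sum is positive, else 0
    return np.where(z > 0, 1, 0)

def sigmoid(z):
    # Neural-network activation: smooth, continuous values between 0 and 1
    return 1 / (1 + np.exp(-z))

z = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
print(step(z))     # [0 0 0 1 1]
print(sigmoid(z))  # [0.269 0.475 0.5   0.525 0.731] (rounded)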
Image('PERCEPTRON_CONCEPT.PNG',width=200,height=200)
- The figure above shows the perceptron concept.
- Each input is multiplied by a weight, which controls how much that input matters.
- The bias shifts the threshold at which the activation function fires.
- The output is produced by multiplying each input by its weight, summing the results, and adding the bias.
import numpy as np

class _Gate_Function:
    def __init__(self):
        # Weights, bias, and the two inputs, initialised to zero
        self.w1, self.w2, self.b, self.x1, self.x2 = 0, 0, 0, 0, 0

    def _Perceptron(self):
        # Weighted sum of the inputs plus the bias, passed through a step function
        w = np.array([self.w1, self.w2])
        x = np.array([self.x1, self.x2])
        if np.sum(x * w) + self.b > 0:
            return 1
        else:
            return 0
Let's make AND, OR, NAND, and XOR gates using a perceptron
AND gate
Image('ANDGATE.PNG',width=400,height=400)
import numpy as np

class _Gate_Function:
    def __init__(self):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0, 0, 0, 0, 0

    def _Perceptron(self):
        w = np.array([self.w1, self.w2])
        x = np.array([self.x1, self.x2])
        if np.sum(x * w) + self.b > 0:
            return 1
        else:
            return 0

    def _and(self, x1, x2):
        # AND: only (1, 1) pushes the weighted sum above the bias
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.69, x1, x2
        return self._Perceptron()

if __name__ == "__main__":
    print(_Gate_Function()._and(0, 0))  # 0
    print(_Gate_Function()._and(0, 1))  # 0
    print(_Gate_Function()._and(1, 0))  # 0
    print(_Gate_Function()._and(1, 1))  # 1
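- To see why these particular weights and bias work, here is a small check (a sketch, not part of the original code) that prints the weighted sum 0.49*x1 + 0.49*x2 - 0.69 for each input pair; only (1, 1) pushes the sum above zero:
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    s = 0.49 * x1 + 0.49 * x2 - 0.69
    # Only (1, 1) gives 0.29 > 0, so only that pair fires the perceptron
    print((x1, x2), round(s, 2), int(s > 0))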
OR gate
Image('ORGATE.PNG',width=400,height=400)
import numpy as np

class _Gate_Function:
    def __init__(self):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0, 0, 0, 0, 0

    def _Perceptron(self):
        w = np.array([self.w1, self.w2])
        x = np.array([self.x1, self.x2])
        if np.sum(x * w) + self.b > 0:
            return 1
        else:
            return 0

    def _and(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.69, x1, x2
        return self._Perceptron()

    def _or(self, x1, x2):
        # OR: a single 1 is already enough to push the weighted sum above the smaller bias
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.19, x1, x2
        return self._Perceptron()

if __name__ == "__main__":
    print(_Gate_Function()._or(0, 0))  # 0
    print(_Gate_Function()._or(0, 1))  # 1
    print(_Gate_Function()._or(1, 0))  # 1
    print(_Gate_Function()._or(1, 1))  # 1
NAND gate
Image('NANDGATE.PNG',width=400,height=400)
import numpy as np

class _Gate_Function:
    def __init__(self):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0, 0, 0, 0, 0

    def _Perceptron(self):
        w = np.array([self.w1, self.w2])
        x = np.array([self.x1, self.x2])
        if np.sum(x * w) + self.b > 0:
            return 1
        else:
            return 0

    def _and(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.69, x1, x2
        return self._Perceptron()

    def _or(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.19, x1, x2
        return self._Perceptron()

    def _nand(self, x1, x2):
        # NAND: the AND weights and bias with their signs flipped
        self.w1, self.w2, self.b, self.x1, self.x2 = -0.49, -0.49, 0.69, x1, x2
        return self._Perceptron()

if __name__ == "__main__":
    print(_Gate_Function()._nand(0, 0))  # 1
    print(_Gate_Function()._nand(0, 1))  # 1
    print(_Gate_Function()._nand(1, 0))  # 1
    print(_Gate_Function()._nand(1, 1))  # 0
- However, we can't make an XOR gate with a single perceptron (see the check below).
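- One way to convince yourself is a brute-force search (a sketch using the same step rule as _Perceptron above, not part of the original post): try a grid of weights and biases and check whether any single perceptron reproduces the XOR truth table.
import numpy as np

def step_perceptron(w1, w2, b, x1, x2):
    # Same rule as _Perceptron above: fire when the weighted sum is positive
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

targets = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}  # XOR truth table
grid = np.arange(-1.0, 1.01, 0.1)
found = False
for w1 in grid:
    for w2 in grid:
        for b in grid:
            if all(step_perceptron(w1, w2, b, x1, x2) == y
                   for (x1, x2), y in targets.items()):
                found = True
print(found)  # False: no single linear boundary reproduces XOR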
How to make an XOR gate using perceptrons
- If you plot the two inputs of the OR gate as axes, you get the graph below.
Image('LINEAR.PNG',width=400,height=400)
- As the figure above shows, the OR gate perceptron classifies 0 and 1 with a single linear boundary.
- However, if you plot the two inputs of the XOR gate as axes:
Image('NONLINEAR.PNG',width=400,height=400)
- As the figure above shows, the XOR gate needs a nonlinear boundary to separate 0 from 1.
- So a single perceptron cannot solve this nonlinear problem.
- However, if we stack perceptrons into multiple layers (a "deep" perceptron), we can.
- Let's combine the AND, OR, and NAND gates; we can build an XOR gate as shown below.
Image('XORGATE_FIG.PNG',width=400,height=400)
import numpy as np

class _Gate_Function:
    def __init__(self):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0, 0, 0, 0, 0

    def _Perceptron(self):
        w = np.array([self.w1, self.w2])
        x = np.array([self.x1, self.x2])
        if np.sum(x * w) + self.b > 0:
            return 1
        else:
            return 0

    def _and(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.69, x1, x2
        return self._Perceptron()

    def _or(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = 0.49, 0.49, -0.19, x1, x2
        return self._Perceptron()

    def _nand(self, x1, x2):
        self.w1, self.w2, self.b, self.x1, self.x2 = -0.49, -0.49, 0.69, x1, x2
        return self._Perceptron()

    def _xor(self, x1, x2):
        # XOR: feed the inputs through NAND and OR, then AND the two results
        _s1 = self._nand(x1, x2)
        _s2 = self._or(x1, x2)
        return self._and(_s1, _s2)

if __name__ == "__main__":
    print(_Gate_Function()._xor(0, 0))  # 0
    print(_Gate_Function()._xor(0, 1))  # 1
    print(_Gate_Function()._xor(1, 0))  # 1
    print(_Gate_Function()._xor(1, 1))  # 0
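- The same gate combination can also be written as a small two-layer network in matrix form, which is the "deep" idea in miniature. This is a sketch, not part of the original post; the weights are simply the NAND, OR, and AND parameters above arranged into matrices.
import numpy as np

def step(z):
    return np.where(z > 0, 1, 0)

# Layer 1: one column of weights per hidden perceptron (NAND, OR)
W1 = np.array([[-0.49, 0.49],
               [-0.49, 0.49]])
b1 = np.array([0.69, -0.19])
# Layer 2: a single AND perceptron on top of the two hidden outputs
W2 = np.array([0.49, 0.49])
b2 = -0.69

def xor(x1, x2):
    x = np.array([x1, x2])
    h = step(x @ W1 + b1)          # hidden layer: [NAND(x1, x2), OR(x1, x2)]
    return int(step(h @ W2 + b2))  # output layer: AND(h1, h2)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), xor(x1, x2))   # 0, 1, 1, 0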