Simple Neural Network Structure

 

> Neurons

- Input → Weighted sum (x·W + bias) → Activation function → Output

 

### Simple Neural Network Structure ###

import numpy as np
import tensorflow as tf

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def Neuron(x, W, bias=0):
    # Weighted input plus bias, passed through the sigmoid activation
    z = x * W + bias
    return sigmoid(z)

x = tf.random.normal((1, 2), 0, 1)
W = tf.random.normal((1, 2), 0, 1)

print('x.shape: ', x.shape)
print('W.shape: ', W.shape)

print(x)
print(W)

print(Neuron(x, W))

 

Perceptron learning algorithm (weight update)

 

### Perceptron learning algorithm (weight update) ###

x = 1
y = 0
W = tf.random.normal([1], 0, 1)
print(Neuron(x, W))
print('y: ', y)

for i in range(1000):
    output = Neuron(x, W)
    error = y - output
    W = W + x * 0.1 * error # Learning rate : 0.1
    
    if i % 100 == 99:
        print("{}\t{}\t{}".format(i + 1, error, output))
        
# 100     [-0.08537974]   [0.08537974]
# 200     [-0.04736642]   [0.04736642]
# 300     [-0.03253129]   [0.03253129]
# 400     [-0.02470946]   [0.02470946]
# 500     [-0.01989669]   [0.01989669]
# 600     [-0.01664264]   [0.01664264]
# 700     [-0.01429797]   [0.01429797]
# 800     [-0.01252933]   [0.01252933]
# 900     [-0.01114822]   [0.01114822]
# 1000    [-0.01004019]   [0.01004019]
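The same weight-update rule can be checked without TensorFlow. A minimal NumPy-only sketch of the loop above (the seed and starting weight are made-up so the run is reproducible; learning rate 0.1 and target y = 0 match the code above):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x, y = 1.0, 0.0
rng = np.random.default_rng(0)     # assumed seed, for reproducibility
w = rng.normal(0, 1)               # single scalar weight

for i in range(1000):
    output = sigmoid(x * w)
    error = y - output
    w += x * 0.1 * error           # same rule: W = W + x * lr * error

print(output)  # close to the target 0
```

Because the error is always negative here, `w` is pushed steadily downward and the sigmoid output decays toward 0, matching the printed results above.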

def Neuron2(x, W, bias=0):
    # (1, 3) x (3, 1) -> (1, 1): dot product of input and weight row vectors
    z = tf.matmul(x, W, transpose_b=True) + bias
    return sigmoid(z)

x = tf.random.normal((1, 3), 0, 1)
y = tf.ones(1)
W = tf.random.normal((1, 3), 0, 1)

print(Neuron2(x, W))
print("y: ", y)
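Neuron2 above only runs a forward pass. The same update rule from the scalar example extends to a weight vector: each weight is nudged by its own input component times the error. A minimal sketch in plain NumPy (the input and initial weights are made-up fixed values so the run is reproducible; shapes mirror the (1, 3) TF code):

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.array([[1.0, -2.0, 0.5]])   # assumed example input, shape (1, 3)
y = np.ones((1, 1))
W = np.array([[0.2, -0.4, 0.1]])   # assumed initial weights, shape (1, 3)

for i in range(1000):
    output = sigmoid(x @ W.T)      # (1, 3) @ (3, 1) -> (1, 1)
    error = y - output
    W = W + 0.1 * error * x        # each weight updated by its own input component

print(output)  # approaches the target 1
```

The scalar `error` broadcasts over the (1, 3) input, so the one-line update covers all three weights at once.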