[ANN] Implementing a Neural Network
Single-layer neural network

import numpy as np

def sigmoid(x):  # sigmoid must be defined before use; it was missing from this snippet
    return 1 / (1 + np.exp(-x))

inputs = np.array([0.5, -0.3])
weights = np.array([0.4, 0.6])
bias = -0.5
y = sigmoid(np.dot(inputs, weights.T) + bias)
print(y)

Multi-layer neural network

inputs = [1.0, 0.5]  # 1x2 matrix
w1 = np.array([[0.1, 0.2, 0.3], [0.2, 0.3, 0.4]])  # 2x3 matrix
b1 = np.array([0.2, 0.3, 0.4])  # 3 nodes
w2 = np.array([[0.1, 0.2], [0.3, 0.2], [0.3, 0.4]])  # 3x2 matrix
b2 = np.array([0.1, 0.2])  # 2 nodes
w3 = np.array([..
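The excerpt is cut off at the third weight matrix, but the pattern is already clear: each layer is a dot product plus bias, passed through an activation. A minimal sketch of the full forward pass, using hypothetical values for w3 and b3 (the originals are truncated) and sigmoid between layers:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.array([1.0, 0.5])                             # 1x2 input
w1 = np.array([[0.1, 0.2, 0.3], [0.2, 0.3, 0.4]])    # 2x3
b1 = np.array([0.2, 0.3, 0.4])
w2 = np.array([[0.1, 0.2], [0.3, 0.2], [0.3, 0.4]])  # 3x2
b2 = np.array([0.1, 0.2])
w3 = np.array([[0.2, 0.3], [0.4, 0.1]])              # 2x2, made-up values
b3 = np.array([0.1, 0.1])

a1 = sigmoid(x @ w1 + b1)   # hidden layer 1: shape (3,)
a2 = sigmoid(a1 @ w2 + b2)  # hidden layer 2: shape (2,)
out = a2 @ w3 + b3          # output layer (identity activation): shape (2,)
print(out)
```

Each `@` contracts the previous layer's node dimension against the weight matrix's first axis, which is why the shapes must chain 2 → 3 → 2 → 2.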
2022. 11. 17.
[Deep Learning] Implementing Activation Functions
Sigmoid

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

sigmoid(4)
0.9820137900379085

import matplotlib.pyplot as plt

x = np.arange(-10, 10, 0.01)
y = sigmoid(x)
plt.plot(x, y)

ReLU

def relu(x):
    return np.maximum(0, x)

x = np.arange(-10, 10, 0.01)
y = relu(x)
plt.plot(x, y)

Softmax

def origin_softmax(x):
    f_x = np.exp(x) / np.sum(np.exp(x))
    return f_x

x = np.array([1.3, 5.1, 2.2, 0.7, ..
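A caveat worth noting next to `origin_softmax`: `np.exp(x)` overflows for moderately large inputs. The standard fix (not shown in the excerpt, but a widely used identity) subtracts the maximum before exponentiating, which leaves the result unchanged:

```python
import numpy as np

def softmax(x):
    # exp(x - max(x)) / sum(exp(x - max(x))) equals exp(x) / sum(exp(x)),
    # but never exponentiates a large positive number, so it cannot overflow.
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

x = np.array([1.3, 5.1, 2.2, 0.7])
print(softmax(x))           # probabilities summing to 1
print(softmax(x + 1000.0))  # same probabilities; the naive version would overflow here
```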
2022. 11. 17.
[ANN] Perceptron Operations
Perceptron operations

import numpy as np

grade = 10
test = 20

# 1x2
input0 = np.array([grade, test])
# 2x2
w1 = np.array([[0.5, 0.12], [0.3, 0.4]])
# 2x1
w2 = np.array([[0.4], [0.5]])

result_0 = np.dot(input0, w1)
result_1 = np.dot(result_0, w2)
print('result_0:', result_0)
print('result_1:', result_1)

result_0: [11.   9.2]
result_1: [9.]

Step function (the basic activation function)

def step(h):
    return np.array(h >= 0, dtype="int") ..
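The excerpt ends before `step` is actually wired into the perceptron. A small sketch of how that typically looks: apply `step` after each dot product, shifting by a threshold so the comparison against zero is meaningful (the threshold values here are illustrative, not from the post):

```python
import numpy as np

def step(h):
    # 1 where h >= 0, else 0
    return np.array(h >= 0, dtype="int")

input0 = np.array([10, 20])
w1 = np.array([[0.5, 0.12], [0.3, 0.4]])
w2 = np.array([[0.4], [0.5]])

h1 = step(np.dot(input0, w1) - 10)  # hidden layer, threshold 10 (illustrative)
out = step(np.dot(h1, w2) - 0.6)    # output layer, threshold 0.6 (illustrative)
print(out)
```

Subtracting the threshold before `step` is equivalent to adding a bias term, which is how the sigmoid examples in the other posts handle it.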
2022. 11. 10.