[Deep Learning] Implementing Activation Functions
Sigmoid

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

sigmoid(4)
# 0.9820137900379085

import matplotlib.pyplot as plt

x = np.arange(-10, 10, 0.01)
y = sigmoid(x)
plt.plot(x, y)

ReLU

def relu(x):
    return np.maximum(0, x)

x = np.arange(-10, 10, 0.01)
y = relu(x)
plt.plot(x, y)

Softmax

def origin_softmax(x):
    f_x = np.exp(x) / np.sum(np.exp(x))
    return f_x

x = np.array([1.3, 5.1, 2.2, 0.7, ..
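The Softmax snippet above is cut off by the preview; as a hedged addition (the shifted-max trick and the sample input below are illustrative, not taken from the original post), a numerically stable variant subtracts the maximum before exponentiating so np.exp never overflows:

import numpy as np

def stable_softmax(x):
    # Subtracting the max keeps np.exp from overflowing on large inputs
    shifted = x - np.max(x)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

x = np.array([1.3, 5.1, 2.2, 0.7, 1.1])  # illustrative values, not the post's original array
print(stable_softmax(x))        # probabilities in (0, 1)
print(stable_softmax(x).sum())  # sums to 1.0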
2022. 11. 17.
[PyTorch] Activation Functions: Non-linear Activations (weighted sum, nonlinearity)
Sigmoid (0 ~ 1)

import torch
import matplotlib.pyplot as plt

x = torch.linspace(-10, 10, 100)
y = torch.sigmoid(x)
print(x)
print(y)

tensor([-10.0000,  -9.7980,  -9.5960,  -9.3939,  -9.1919,  -8.9899,  -8.7879, ..
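The preview only shows Sigmoid; as a rough sketch (the choice of Tanh and ReLU here is an assumption about what the post covers, reusing the same torch.linspace input), the other common non-linearities could be plotted on one axis for comparison:

import torch
import matplotlib.pyplot as plt

x = torch.linspace(-10, 10, 100)

# Compare a few non-linear activations over the same input range
plt.plot(x, torch.sigmoid(x), label="Sigmoid (0 ~ 1)")
plt.plot(x, torch.tanh(x), label="Tanh (-1 ~ 1)")
plt.plot(x, torch.relu(x), label="ReLU (max(0, x))")
plt.legend()
plt.show()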
2022. 9. 24.