Module: Qoa::ActivationFunctions
- Defined in:
- lib/qoa/activation_functions.rb
Class Method Summary
- .elu(x, alpha = 1.0) ⇒ Object
- .elu_derivative(x, alpha = 1.0) ⇒ Object
- .leaky_relu(x, alpha = 0.01) ⇒ Object
- .leaky_relu_derivative(x, alpha = 0.01) ⇒ Object
- .relu(x) ⇒ Object
- .relu_derivative(x) ⇒ Object
- .sigmoid(x) ⇒ Object
- .sigmoid_derivative(x) ⇒ Object
- .softmax(x) ⇒ Object
- .softmax_derivative(x) ⇒ Object
- .swish(x, beta = 1.0) ⇒ Object
- .swish_derivative(x, beta = 1.0) ⇒ Object
- .tanh(x) ⇒ Object
- .tanh_derivative(x) ⇒ Object
Class Method Details
.elu(x, alpha = 1.0) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 36
def elu(x, alpha = 1.0)
  x < 0 ? (alpha * (Math.exp(x) - 1)) : x
end
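For context, a quick usage sketch (the method is redefined locally so the snippet runs standalone):

```ruby
# Self-contained copy of the ELU method shown above.
def elu(x, alpha = 1.0)
  x < 0 ? (alpha * (Math.exp(x) - 1)) : x
end

# Positive inputs pass through unchanged.
puts elu(2.0)            # => 2.0
# Negative inputs saturate smoothly toward -alpha.
puts elu(-1.0).round(4)  # => -0.6321
```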
.elu_derivative(x, alpha = 1.0) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 40
def elu_derivative(x, alpha = 1.0)
  x < 0 ? (alpha * Math.exp(x)) : 1.0
end
.leaky_relu(x, alpha = 0.01) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 28
def leaky_relu(x, alpha = 0.01)
  x < 0 ? (alpha * x) : x
end
.leaky_relu_derivative(x, alpha = 0.01) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 32
def leaky_relu_derivative(x, alpha = 0.01)
  x < 0 ? alpha : 1.0
end
.relu(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 20
def relu(x)
  x < 0 ? 0 : x
end
.relu_derivative(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 24
def relu_derivative(x)
  x < 0 ? 0 : 1.0
end
.sigmoid(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 4
def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end
.sigmoid_derivative(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 8
def sigmoid_derivative(x)
  x * (1.0 - x)
end
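A detail worth noting: `sigmoid_derivative` appears to follow the common convention of taking the *already-activated* value rather than the raw input, since `a * (1 - a)` equals the derivative of `sigmoid(z)` when `a = sigmoid(z)`. A self-contained sketch illustrating this (both methods redefined locally):

```ruby
def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end

# Expects the sigmoid *output* a, not the raw input z:
# if a = sigmoid(z), then d/dz sigmoid(z) = a * (1 - a).
def sigmoid_derivative(x)
  x * (1.0 - x)
end

z = 0.3
a = sigmoid(z)
# The analytic derivative computed from the raw input agrees with a * (1 - a).
analytic = Math.exp(-z) / (1.0 + Math.exp(-z))**2
puts (sigmoid_derivative(a) - analytic).abs < 1e-12  # => true
```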
.softmax(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 52
def softmax(x)
  exps = x.map { |e| Math.exp(e - x.max) }
  sum = exps.inject(:+)
  exps.map { |e| e / sum }
end
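Unlike the other methods, `softmax` operates on an array rather than a scalar. Subtracting `x.max` before exponentiating is the standard numerical-stability trick: it prevents overflow for large inputs without changing the result. A runnable sketch (method redefined locally):

```ruby
def softmax(x)
  # Subtracting the max keeps Math.exp from overflowing on large inputs;
  # the shared factor cancels out in the normalization.
  exps = x.map { |e| Math.exp(e - x.max) }
  sum = exps.inject(:+)
  exps.map { |e| e / sum }
end

probs = softmax([1.0, 2.0, 3.0])
puts probs.map { |p| p.round(4) }.inspect  # => [0.09, 0.2447, 0.6652]
# The outputs form a probability distribution (sum to 1, up to rounding).
puts (probs.inject(:+) - 1.0).abs < 1e-12  # => true
```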
.softmax_derivative(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 58
def softmax_derivative(x)
  x.map { |e| e * (1 - e) }
end
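As with `sigmoid_derivative`, this method seems to expect already-activated values: mapping each probability `p` to `p * (1 - p)` yields only the diagonal of the softmax Jacobian, ignoring the off-diagonal cross terms. A small sketch of that reading (method redefined locally):

```ruby
def softmax_derivative(x)
  x.map { |e| e * (1 - e) }
end

# Takes already-activated probabilities and returns p * (1 - p) per entry:
# the diagonal of the softmax Jacobian (off-diagonal terms are omitted).
probs = [0.09, 0.2447, 0.6652]  # roughly softmax([1, 2, 3])
diag = softmax_derivative(probs)
puts diag.map { |d| d.round(4) }.inspect  # => [0.0819, 0.1848, 0.2227]
```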
.swish(x, beta = 1.0) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 44
def swish(x, beta = 1.0)
  x * sigmoid(beta * x)
end
.swish_derivative(x, beta = 1.0) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 48
def swish_derivative(x, beta = 1.0)
  # d/dx [x * sigmoid(beta * x)]
  #   = sigmoid(beta * x) + beta * x * sigmoid(beta * x) * (1 - sigmoid(beta * x))
  sigmoid(beta * x) + beta * swish(x, beta) * (1 - sigmoid(beta * x))
end
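The swish derivative is easy to sanity-check against a central finite difference. This standalone sketch uses the standard closed form sigmoid(βx) + βx·sigmoid(βx)·(1 − sigmoid(βx)); note that the frequently quoted identity `swish(x) + sigmoid(x) * (1 - swish(x))` is equivalent only at β = 1:

```ruby
def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end

def swish(x, beta = 1.0)
  x * sigmoid(beta * x)
end

# Standard closed form of d/dx [x * sigmoid(beta * x)].
def swish_derivative(x, beta = 1.0)
  s = sigmoid(beta * x)
  s + beta * x * s * (1 - s)
end

# Verify against a central finite difference at beta = 2.
x, beta, h = 0.7, 2.0, 1e-6
numeric = (swish(x + h, beta) - swish(x - h, beta)) / (2 * h)
puts (swish_derivative(x, beta) - numeric).abs < 1e-6  # => true
```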
.tanh(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 12
def tanh(x)
  Math.tanh(x)
end
.tanh_derivative(x) ⇒ Object
# File 'lib/qoa/activation_functions.rb', line 16
def tanh_derivative(x)
  1.0 - (tanh(x) ** 2.0)
end