Class: DNN::Optimizers::RMSProp
Defined in: lib/dnn/core/optimizers.rb
Instance Attribute Summary
- #muse ⇒ Object
  Returns the value of attribute muse.
Attributes inherited from Optimizer
- #learning_rate
Class Method Summary
- .load_hash(hash) ⇒ Object
Instance Method Summary
- #initialize(learning_rate = 0.001, muse = 0.9) ⇒ RMSProp (constructor)
  A new instance of RMSProp.
- #to_hash ⇒ Object
- #update(layer) ⇒ Object
Constructor Details
#initialize(learning_rate = 0.001, muse = 0.9) ⇒ RMSProp
Returns a new instance of RMSProp.
# File 'lib/dnn/core/optimizers.rb', line 85

def initialize(learning_rate = 0.001, muse = 0.9)
  super(learning_rate)
  @muse = muse
  @g = {}
end
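A minimal construction sketch, assuming the gem is loaded with require "dnn" (the require path is an assumption): learning_rate is the inherited step size and muse is the decay rate of the running average of squared gradients.

require "dnn"   # assumed entry point for the ruby-dnn gem

# Both arguments are optional and default to 0.001 and 0.9.
opt = DNN::Optimizers::RMSProp.new(0.001, 0.9)
opt.muse # => 0.9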
Instance Attribute Details
#muse ⇒ Object
Returns the value of attribute muse.
# File 'lib/dnn/core/optimizers.rb', line 83

def muse
  @muse
end
Class Method Details
.load_hash(hash) ⇒ Object
# File 'lib/dnn/core/optimizers.rb', line 91

def self.load_hash(hash)
  self.new(hash[:learning_rate], hash[:muse])
end
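A short sketch of rebuilding an optimizer from a hash; the keys mirror those written by #to_hash below, and the hash literal here is purely illustrative.

hash = { name: "DNN::Optimizers::RMSProp", learning_rate: 0.01, muse: 0.95 }
opt = DNN::Optimizers::RMSProp.load_hash(hash)
opt.muse # => 0.95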
Instance Method Details
#to_hash ⇒ Object
# File 'lib/dnn/core/optimizers.rb', line 104

def to_hash
  {
    name: self.class.name,
    learning_rate: @learning_rate,
    muse: @muse,
  }
end
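Together with .load_hash this gives a simple serialization round trip; a sketch:

opt = DNN::Optimizers::RMSProp.new(0.002, 0.99)
h = opt.to_hash
# => { name: "DNN::Optimizers::RMSProp", learning_rate: 0.002, muse: 0.99 }
restored = DNN::Optimizers::RMSProp.load_hash(h)
restored.muse # => 0.99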
#update(layer) ⇒ Object
# File 'lib/dnn/core/optimizers.rb', line 95

def update(layer)
  @g[layer] ||= {}
  layer.params.each_key do |key|
    @g[layer][key] ||= 0
    @g[layer][key] = @muse * @g[layer][key] + (1 - @muse) * layer.grads[key]**2
    layer.params[key] -= (@learning_rate / NMath.sqrt(@g[layer][key] + 1e-7)) * layer.grads[key]
  end
end
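For each parameter key, #update keeps a running average of squared gradients, g = muse * g + (1 - muse) * grad**2, and then steps the parameter by -(learning_rate / sqrt(g + 1e-7)) * grad, so parameters with consistently large gradients take proportionally smaller steps. A minimal sketch of calling it directly, assuming the gem is already loaded; FakeLayer is a hypothetical stand-in, since the optimizer only needs an object exposing params and grads hashes with matching keys.

require "numo/narray"

FakeLayer = Struct.new(:params, :grads)   # hypothetical stand-in, not part of the library

layer = FakeLayer.new(
  { weight: Numo::SFloat[1.0, 2.0, 3.0] },   # parameters
  { weight: Numo::SFloat[0.1, -0.2, 0.3] }   # gradients for the same keys
)

opt = DNN::Optimizers::RMSProp.new(0.01, 0.9)
opt.update(layer)
# layer.params[:weight] has moved opposite the sign of each gradient component,
# with each step normalized by the accumulated squared gradient.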