Class: DNN::Optimizers::Optimizer

Inherits:
Object
Defined in:
lib/dnn/core/optimizers.rb

Overview

Super class of all optimizer classes.

Direct Known Subclasses

AdaDelta, AdaGrad, Adam, RMSProp, SGD

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(learning_rate) ⇒ Optimizer

Returns a new instance of Optimizer.



# File 'lib/dnn/core/optimizers.rb', line 8

def initialize(learning_rate)
  @learning_rate = learning_rate
end
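
A subclass's constructor typically forwards the learning rate to this initializer via super. A minimal sketch (MyOptimizer and the 0.01 default are hypothetical, used only for illustration):

class MyOptimizer < DNN::Optimizers::Optimizer
  def initialize(learning_rate = 0.01)
    super(learning_rate)  # base class stores the value in @learning_rate
  end
end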

Instance Attribute Details

#learning_rate ⇒ Object

Returns the value of attribute learning_rate.



# File 'lib/dnn/core/optimizers.rb', line 6

def learning_rate
  @learning_rate
end
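
Using the hypothetical MyOptimizer sketched above, the inherited reader exposes the value passed to the constructor:

opt = MyOptimizer.new(0.01)
opt.learning_rate # => 0.01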

Instance Method Details

#to_hash(merge_hash = nil) ⇒ Object



# File 'lib/dnn/core/optimizers.rb', line 18

def to_hash(merge_hash = nil)
  hash = {class: self.class.name, learning_rate: @learning_rate}
  hash.merge!(merge_hash) if merge_hash
  hash
end

#update(params) ⇒ Object

Updates params. Classes that inherit from this class must implement this method.

Raises:

  • (NotImplementedError)


# File 'lib/dnn/core/optimizers.rb', line 14

def update(params)
  raise NotImplementedError.new("Class '#{self.class.name}' must implement method 'update'")
end
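
A subclass overrides #update with its parameter update rule. A minimal vanilla gradient-descent sketch, assuming each element of params responds to data and grad accessors (an assumption about the parameter objects, not documented here):

class MySGD < DNN::Optimizers::Optimizer
  # Vanilla gradient descent: param <- param - learning_rate * gradient.
  def update(params)
    params.each do |param|
      param.data -= @learning_rate * param.grad  # data/grad are assumed accessors
    end
  end
end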