Class: Ai4r::Som::TwoPhaseLayer
- Defined in:
- lib/ai4r/som/two_phase_layer.rb
Overview
Layer that trains in two distinct phases with different learning rates.
Instance Method Summary
- #initialize(nodes, learning_rate = 0.9, phase_one = 150, phase_two = 100, phase_one_learning_rate = 0.1, phase_two_learning_rate = 0, options = {}) ⇒ Object constructor
- #learning_rate_decay(epoch) ⇒ Object
Returns one of two values, depending on the phase: in phase one, the learning rate is incrementally reduced every time this method is called; at the switch of phases, the learning rate is reset and @delta_lr (the per-epoch decay step of the learning rate) is recomputed; in phase two, the newly computed @delta_lr is used to incrementally reduce the learning rate.
- #radius_decay(epoch) ⇒ Object
Returns one of two values, depending on the phase: in phase one, the radius is incrementally reduced by 1 every @radius_reduction epochs; in phase two, the radius is fixed at 1.
Methods inherited from Layer
Methods included from Data::Parameterizable
#get_parameters, included, #set_parameters
Constructor Details
#initialize(nodes, learning_rate = 0.9, phase_one = 150, phase_two = 100, phase_one_learning_rate = 0.1, phase_two_learning_rate = 0, options = {}) ⇒ Object
# File 'lib/ai4r/som/two_phase_layer.rb', line 37

def initialize(nodes, learning_rate = 0.9, phase_one = 150, phase_two = 100,
               phase_one_learning_rate = 0.1, phase_two_learning_rate = 0,
               options = {})
  super(nodes, nodes, phase_one + phase_two, learning_rate, options)
  @phase_one = phase_one
  @phase_two = phase_two
  @lr = @initial_learning_rate
  @phase_one_learning_rate = phase_one_learning_rate
  @phase_two_learning_rate = phase_two_learning_rate
  @radius_reduction = (@phase_one / ((nodes / 2.0) - 1)) + 1
  @delta_lr = (@lr - @phase_one_learning_rate) / @phase_one
  @radius = (nodes / 2.0).to_i
end
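The schedule constants the constructor derives can be checked in isolation. This is a standalone sketch in plain Ruby using the documented default arguments, not a call into the Ai4r gem:

```ruby
# Sketch: the schedule constants derived in the constructor,
# computed with the documented defaults (nodes is chosen as 10 here).
nodes = 10
learning_rate = 0.9
phase_one = 150
phase_one_learning_rate = 0.1

# Interval (in epochs) between radius reductions during phase one
radius_reduction = (phase_one / ((nodes / 2.0) - 1)) + 1  # => 38.5
# Per-epoch learning-rate decrement used during phase one
delta_lr = (learning_rate - phase_one_learning_rate) / phase_one
# Initial neighbourhood radius: half the layer size
radius = (nodes / 2.0).to_i                               # => 5

puts radius_reduction, delta_lr, radius
```

Note that @radius_reduction is a Float (38.5 with these defaults), which matters for the modulo test in #radius_decay below.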
Instance Method Details
#learning_rate_decay(epoch) ⇒ Object
Returns one of two values, depending on the phase: in phase one, the learning rate is incrementally reduced every time this method is called; at the switch of phases, the learning rate is reset and @delta_lr (the per-epoch decay step of the learning rate) is recomputed; in phase two, the newly computed @delta_lr is used to incrementally reduce the learning rate.
# File 'lib/ai4r/som/two_phase_layer.rb', line 72

def learning_rate_decay(epoch)
  if epoch < @phase_one
    @lr -= @delta_lr
    @lr
  elsif epoch == @phase_one
    @lr = @phase_one_learning_rate
    @delta_lr = (@phase_one_learning_rate - @phase_two_learning_rate) / @phase_two
    @lr
  else
    @lr -= @delta_lr
  end
end
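The phase switch can be traced with a small standalone reimplementation of the decay logic above (a sketch using the default schedule values, not the gem's API):

```ruby
# Sketch: trace the two-phase learning-rate schedule from the listing above.
phase_one = 150
phase_two = 100
lr = 0.9                     # initial learning rate
phase_one_lr = 0.1
phase_two_lr = 0.0
delta_lr = (lr - phase_one_lr) / phase_one

decay = lambda do |epoch|
  if epoch < phase_one
    lr -= delta_lr           # phase one: step down on every call
  elsif epoch == phase_one
    lr = phase_one_lr        # switch: reset the rate and the decay step
    delta_lr = (phase_one_lr - phase_two_lr) / phase_two
  else
    lr -= delta_lr           # phase two: step down with the new delta
  end
  lr
end

(0..(phase_one + phase_two - 1)).each { |e| decay.call(e) }
puts lr  # ends close to phase_two_lr
```

Epochs 0..149 walk the rate linearly from 0.9 down toward 0.1; epoch 150 resets it to exactly 0.1 and recomputes the step; the remaining epochs walk it toward phase_two_lr.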
#radius_decay(epoch) ⇒ Object
Returns one of two values, depending on the phase: in phase one, the radius is incrementally reduced by 1 every @radius_reduction epochs; in phase two, the radius is fixed at 1.
# File 'lib/ai4r/som/two_phase_layer.rb', line 57

def radius_decay(epoch)
  return 1 if epoch > @phase_one

  @radius -= 1 if (epoch % @radius_reduction).zero?
  @radius
end
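A standalone sketch of the radius schedule, mirroring the logic above with the default nodes = 10 and phase_one = 150 (again plain Ruby, not the gem's API):

```ruby
# Sketch: trace the radius schedule from radius_decay above.
nodes = 10
phase_one = 150
radius = (nodes / 2.0).to_i                              # starts at 5
radius_reduction = (phase_one / ((nodes / 2.0) - 1)) + 1 # 38.5 (a Float)

radius_at = lambda do |epoch|
  return 1 if epoch > phase_one                    # phase two: fixed at 1
  radius -= 1 if (epoch % radius_reduction).zero?  # phase one: shrink stepwise
  radius
end

# The radius drops by 1 only when epoch is an exact multiple of the Float
# radius_reduction (0, 77, ... for 38.5); past phase_one the lambda returns 1.
r_start = radius_at.call(0)    # => 4
r_late  = radius_at.call(200)  # => 1
puts r_start, r_late
```

Because @radius_reduction is a Float here, the modulo test only hits zero at integer multiples of 38.5, so with these defaults the radius shrinks less often than once per 38 epochs; with node counts where the expression yields an Integer, the reduction fires at regular intervals.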