Class: Ai4r::Som::TwoPhaseLayer

Inherits:
Layer
  • Object
Defined in:
lib/ai4r/som/two_phase_layer.rb

Overview

TwoPhaseLayer is responsible for the implementation of the algorithm’s decays and extends the class Layer; it currently overrides the radius and learning rate decay methods of Layer. Training has two phases: in phase one, both the learning rate and the radius decay. The number of epochs for each phase can be passed in, and the total number of epochs is the sum of the epochs for phase one and phase two. In the second phase, the learning rate and radius decay are steady, normally set to a small value (e.g. 0.01).

Parameters

  • nodes => number of nodes in the SOM (nodes x nodes). Has to be the same number you pass to the SOM, and has to be an integer

  • radius => the initial radius for the neighborhood

  • phase_one => number of epochs for phase one, has to be an integer. By default it is set to 150

  • phase_two => number of epochs for phase two, has to be an integer. By default it is set to 100

  • learning_rate => sets the initial learning rate

  • phase_one_learning_rate => sets the learning rate for phase one

  • phase_two_learning_rate => sets the learning rate for phase two
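
A minimal usage sketch, not taken from the library docs: it assumes the companion Ai4r::Som::Som class from the same gem, constructed as Som.new(dim, number_of_nodes, layer) with initiate_map and train, and uses hypothetical training data. The layer is built with the same node count that is passed to the SOM.

require 'ai4r'

nodes = 8
# Layer with default phase lengths (150 + 100 epochs) and a small
# phase-two learning rate; the node count matches the 8 x 8 map below.
layer = Ai4r::Som::TwoPhaseLayer.new(nodes, 0.9, 150, 100, 0.1, 0.005)

# Hypothetical 4-dimensional input vectors.
data = [[0.1, 0.2, 0.3, 0.4],
        [0.9, 0.8, 0.7, 0.6]]

som = Ai4r::Som::Som.new(4, nodes, layer)
som.initiate_map
som.train data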

Instance Method Summary

Methods inherited from Layer

#influence_decay

Methods included from Data::Parameterizable

#get_parameters, included, #set_parameters

Constructor Details

#initialize(nodes, learning_rate = 0.9, phase_one = 150, phase_two = 100, phase_one_learning_rate = 0.1, phase_two_learning_rate = 0) ⇒ TwoPhaseLayer

Returns a new instance of TwoPhaseLayer.



# File 'lib/ai4r/som/two_phase_layer.rb', line 36

def initialize(nodes, learning_rate = 0.9, phase_one = 150, phase_two = 100,
        phase_one_learning_rate = 0.1, phase_two_learning_rate = 0)
  super nodes, nodes, phase_one + phase_two, learning_rate
  @phase_one = phase_one
  @phase_two = phase_two
  @lr = @initial_learning_rate

  @phase_one_learning_rate = phase_one_learning_rate
  @phase_two_learning_rate = phase_two_learning_rate

  @radius_reduction = @phase_one / (nodes/2.0 - 1) + 1
  @delta_lr = (@lr - @phase_one_learning_rate)/ @phase_one
  @radius = (nodes / 2.0).to_i
end
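
As a rough illustration (not part of the library source) of what the constructor above derives with the default arguments and nodes = 10, the same arithmetic can be written out directly:

nodes     = 10
phase_one = 150
phase_two = 100
learning_rate           = 0.9
phase_one_learning_rate = 0.1

total_epochs     = phase_one + phase_two              # => 250
radius           = (nodes / 2.0).to_i                 # => 5
radius_reduction = phase_one / (nodes / 2.0 - 1) + 1  # => 38.5
delta_lr         = (learning_rate - phase_one_learning_rate) / phase_one
# delta_lr => 0.00533... (learning rate step per epoch in phase one)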

Instance Method Details

#learning_rate_decay(epoch) ⇒ Object

Two different values will be returned, depending on the phase. In phase one, the rate is incrementally reduced every time this method is called. On the switch of phases, the learning rate is reset and delta_lr (the decay step of the learning rate) is recomputed. In phase two, the newly computed delta_lr is used to incrementally reduce the learning rate.



# File 'lib/ai4r/som/two_phase_layer.rb', line 72

def learning_rate_decay(epoch)
  if epoch < @phase_one
    @lr -= @delta_lr
    return @lr
  elsif epoch == @phase_one
    @lr = @phase_one_learning_rate
    @delta_lr = (@phase_one_learning_rate - @phase_two_learning_rate)/@phase_two
    return @lr
  else
    @lr -= @delta_lr
  end
end
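
A small sketch (assuming the default constructor arguments) that calls learning_rate_decay once per epoch and prints a few samples, showing the two linear ramps described above:

require 'ai4r'

layer = Ai4r::Som::TwoPhaseLayer.new(10)
(0...250).each do |epoch|
  lr = layer.learning_rate_decay(epoch)
  # Phase one (epoch < 150): lr falls from ~0.9 towards 0.1.
  # At epoch == 150, lr is reset to 0.1 and delta_lr is recomputed.
  # Phase two (epoch > 150): lr falls from 0.1 towards 0.
  puts "epoch #{epoch}: lr #{lr.round(4)}" if (epoch % 50).zero?
end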

#radius_decay(epoch) ⇒ Object

Two different values will be returned, depending on the phase. In phase one, the radius is incrementally reduced by 1 every @radius_reduction epochs; in phase two, the radius is fixed to 1.



# File 'lib/ai4r/som/two_phase_layer.rb', line 54

def radius_decay(epoch)
  if epoch > @phase_one
    return 1
  else
    if (epoch % @radius_reduction) == 0
      @radius -= 1
    end
    @radius
  end

end
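
And a similar sketch for the radius schedule (again assuming the defaults and nodes = 10, so the initial radius is 5 and @radius_reduction is 38.5): during phase one the radius drops by 1 whenever epoch is an exact multiple of @radius_reduction, and once epoch exceeds @phase_one it is pinned to 1.

require 'ai4r'

layer = Ai4r::Som::TwoPhaseLayer.new(10)
(0...250).each do |epoch|
  r = layer.radius_decay(epoch)
  puts "epoch #{epoch}: radius #{r}" if (epoch % 50).zero?
end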