Class: MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::Base

Inherits: Object
Defined in:
lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb

Overview

Natural Evolution Strategies base class

Direct Known Subclasses

BDNES, RNES, SNES, XNES

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(ndims, obj_fn, opt_type, rseed: nil, mu_init: 0, sigma_init: 1, parallel_fit: false, rescale_popsize: 1, rescale_lrate: 1, dtype: :float64) ⇒ Base

NES object initialization

Parameters:

  • ndims (Integer)

    number of parameters to optimize

  • obj_fn (#call)

    any object defining a #call method (Proc, lambda, custom class)

  • opt_type (:min, :max)

    selects whether to minimize or maximize obj_fn

  • rseed (Integer) (defaults to: nil)

    allows for deterministic execution when a seed is provided

  • mu_init (Numeric) (defaults to: 0)

    values to initialize the distribution’s mean

  • sigma_init (Numeric) (defaults to: 1)

    values to initialize the distribution’s covariance

  • parallel_fit (Boolean) (defaults to: false)

    whether the `obj_fn` should be passed all the individuals together. In the canonical case the fitness function scores a single individual; in practice it is often easier to delegate parallelization of the scoring to the external fitness function. Setting this to `true` makes the algorithm pass _an Array_ of individuals to the fitness function, rather than a single individual.

  • rescale_popsize (Float) (defaults to: 1)

    scaling for the default population size

  • rescale_lrate (Float) (defaults to: 1)

    scaling for the default learning rate

  • dtype (NMatrix dtype) (defaults to: :float64)

    NMatrix dtype used for all matrix computations

Raises:

  • (ArgumentError)


# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 23

def initialize ndims, obj_fn, opt_type, rseed: nil, mu_init: 0, sigma_init: 1, parallel_fit: false, rescale_popsize: 1, rescale_lrate: 1, dtype: :float64
  raise ArgumentError unless [:min, :max].include? opt_type
  raise ArgumentError unless obj_fn.respond_to? :call
  @ndims, @opt_type, @obj_fn, @parallel_fit = ndims, opt_type, obj_fn, parallel_fit
  @rescale_popsize, @rescale_lrate = rescale_popsize, rescale_lrate
  @id = NMatrix.identity(ndims, dtype: dtype)
  rseed ||= Random.new_seed
  # puts "NES rseed: #{s}"  # currently disabled
  @rng = Random.new rseed
  @best = [(opt_type==:max ? -1 : 1) * Float::INFINITY, nil]
  @last_fits = []
  @dtype = dtype
  initialize_distribution mu_init: mu_init, sigma_init: sigma_init
end
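
As a usage sketch (an assumption for illustration, not part of this file): any of the Direct Known Subclasses can be constructed with the signature above. Here XNES is paired with a toy sphere objective; all names other than the class and the constructor parameters are illustrative.

# Hypothetical usage sketch: minimize a 5-dimensional sphere function
require 'machine_learning_workbench'

XNES = MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::XNES

# Canonical case: the objective scores a single individual (an Array of Floats)
sphere = -> (ind) { ind.reduce(0) { |sum, x| sum + x**2 } }
nes = XNES.new 5, sphere, :min, rseed: 1

# With `parallel_fit: true` the objective receives an Array of individuals
# and must return an Array of fitnesses
par_sphere = -> (inds) { inds.map { |ind| ind.reduce(0) { |sum, x| sum + x**2 } } }
par_nes = XNES.new 5, par_sphere, :min, parallel_fit: true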

Instance Attribute Details

#best ⇒ Object (readonly)

Returns the value of attribute best.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def best
  @best
end

#dtype ⇒ Object (readonly)

Returns the value of attribute dtype.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def dtype
  @dtype
end

#id ⇒ Object (readonly)

Returns the value of attribute id.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def id
  @id
end

#last_fits ⇒ Object (readonly)

Returns the value of attribute last_fits.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def last_fits
  @last_fits
end

#mu ⇒ Object (readonly)

Returns the value of attribute mu.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def mu
  @mu
end

#ndims ⇒ Object (readonly)

Returns the value of attribute ndims.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def ndims
  @ndims
end

#obj_fn ⇒ Object (readonly)

Returns the value of attribute obj_fn.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def obj_fn
  @obj_fn
end

#opt_type ⇒ Object (readonly)

Returns the value of attribute opt_type.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def opt_type
  @opt_type
end

#parallel_fit ⇒ Object (readonly)

Returns the value of attribute parallel_fit.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def parallel_fit
  @parallel_fit
end

#rescale_lrate ⇒ Object (readonly)

Returns the value of attribute rescale_lrate.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def rescale_lrate
  @rescale_lrate
end

#rescale_popsize ⇒ Object (readonly)

Returns the value of attribute rescale_popsize.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def rescale_popsize
  @rescale_popsize
end

#rng ⇒ Object (readonly)

Returns the value of attribute rng.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def rng
  @rng
end

#sigma ⇒ Object (readonly)

Returns the value of attribute sigma.

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 5

def sigma
  @sigma
end

Instance Method Details

#cmaes_lrate ⇒ Float

Magic numbers from CMA-ES (TODO: add proper citation)

Returns:

  • (Float)

    learning rate lower bound

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 71

def cmaes_lrate
  (3+Math.log(ndims)) / (5*Math.sqrt(ndims))
end
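
As a quick worked value (the dimensionality below is an arbitrary example), the default learning rate shrinks as the number of dimensions grows:

ndims = 100                                    # illustrative
(3 + Math.log(ndims)) / (5 * Math.sqrt(ndims)) # => ~0.152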

#cmaes_popsize ⇒ Integer

Magic numbers from CMA-ES (TODO: add proper citation)

Returns:

  • (Integer)

    population size lower bound

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 77

def cmaes_popsize
  [5, 4 + (3*Math.log(ndims)).floor].max
end
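
Worked value for the same illustrative dimensionality; the lower bound of 5 only matters for very small problems:

ndims = 100                              # illustrative
[5, 4 + (3 * Math.log(ndims)).floor].max # => 17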

#cmaes_utilities ⇒ NMatrix

Magic numbers from CMA-ES (TODO: add proper citation)

Returns:

  • (NMatrix)

    scale-invariant utilities

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 57

def cmaes_utilities
  # Algorithm equations are meant for fitness maximization
  # Match utilities with individuals sorted by INCREASING fitness
  log_range = (1..popsize).collect do |v|
    [0, Math.log(popsize.to_f/2 + 1) - Math.log(v)].max
  end
  total = log_range.reduce(:+)
  buf = 1.0/popsize
  vals = log_range.collect { |v| v / total - buf }.reverse
  NMatrix[vals, dtype: dtype]
end
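
A plain-Ruby sketch of the same weights (no NMatrix), handy for inspecting them by hand; the weights sum to zero and depend only on rank, so shifting all fitnesses by a constant leaves the search unchanged.

# Hedged sketch assuming the standard CMA-ES shaping max(0, log(popsize/2 + 1) - log(rank))
def utilities_for popsize
  log_range = (1..popsize).map { |v| [0, Math.log(popsize.to_f/2 + 1) - Math.log(v)].max }
  total = log_range.reduce(:+)
  log_range.map { |v| v / total - 1.0/popsize }.reverse
end

utilities_for(5) # => approx [-0.2, -0.2, -0.122, 0.085, 0.437], summing to zero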

#interface_methods ⇒ Object

Declaring interface methods - implement these in the child class!

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 119

[:train, :initialize_distribution, :convergence].each do |mname|
  define_method mname do
    raise NotImplementedError, "Implement in child class!"
  end
end
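
For orientation, a hedged skeleton of what a concrete subclass is expected to supply; only the three method names come from the list above, and the comments describe typical responsibilities as assumptions rather than the actual SNES/XNES code.

# Hypothetical subclass skeleton
class MyNES < MachineLearningWorkbench::Optimizer::NaturalEvolutionStrategies::Base
  def initialize_distribution mu_init:, sigma_init:
    # set up the search distribution, e.g. @mu and @sigma, sized from ndims
  end

  def train
    # one generation: sample via sorted_inds, then update the distribution
    # using utils and lrate
  end

  def convergence
    # return a scalar measure of how concentrated the distribution has become
  end
end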

#lrate ⇒ Object

Memoized automatic magic numbers. NOTE: Doubling popsize and halving lrate often helps

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 53

def lrate;   @lrate     ||= cmaes_lrate * rescale_lrate end

#move_inds(inds) ⇒ NMatrix

Move standard normal samples to current distribution

Returns:

  • (NMatrix)

    individuals



# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 90

def move_inds inds
  # TODO: can we reduce the transpositions?
  # sigma.dot(inds.transpose).map(&mu.method(:+)).transpose
  multi_mu = NMatrix[*inds.rows.times.collect {mu.to_a}, dtype: dtype].transpose
  (multi_mu + sigma.dot(inds.transpose)).transpose
  # sigma.dot(inds.transpose).transpose + inds.rows.times.collect {mu.to_a}.to_nm
end
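
In per-individual terms this is the affine map x = mu + sigma.dot(z) applied to every standard-normal row z. A plain-Array sketch of the same computation for a single row (illustrative only; here mu is a flat Array and sigma a nested Array):

def move_one mu, sigma, z
  mu.each_index.map do |i|
    mu[i] + z.each_index.reduce(0) { |acc, j| acc + sigma[i][j] * z[j] }
  end
end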

#popsize ⇒ Object

Memoized automatic magic numbers. NOTE: Doubling popsize and halving lrate often helps

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 51

def popsize; @popsize   ||= cmaes_popsize * rescale_popsize end

#sorted_inds ⇒ Object

Sorted individuals. NOTE: algorithm equations are meant for fitness maximization, so utilities need to be matched with individuals sorted by INCREASING fitness; the order is then reversed for minimization.

Returns:

  • standard normal samples sorted by the respective individuals’ fitnesses

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 102

def sorted_inds
  samples = standard_normal_samples
  inds = move_inds(samples).to_a
  fits = parallel_fit ? obj_fn.call(inds) : inds.map(&obj_fn)
  # Quick cure for NaN fitnesses
  fits.map! { |x| x.nan? ? (opt_type==:max ? -1 : 1) * Float::INFINITY : x }
  @last_fits = fits # allows checking for stagnation
  sorted = [fits, inds, samples.to_a].transpose.sort_by(&:first)
  sorted.reverse! if opt_type==:min
  this_best = sorted.last.take(2)
  opt_cmp_fn = opt_type==:min ? :< : :>
  @best = this_best if this_best.first.send(opt_cmp_fn, best.first)
  NMatrix[*sorted.map(&:last), dtype: dtype]
end
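
A concrete ordering illustration with made-up fitnesses (maximization): samples end up sorted by increasing fitness, so the last row belongs to the current best individual and lines up with the largest utility weight.

fits    = [3.0, 1.0, 2.0]
samples = [:a, :b, :c]                        # stand-ins for sample rows
sorted  = [fits, samples].transpose.sort_by(&:first)
sorted.map(&:last)                            # => [:b, :c, :a]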

#standard_normal_sample ⇒ Float

Box-Muller transform: generates standard (unit) normal distribution samples

Returns:

  • (Float)

    a single sample from a standard normal distribution

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 40

def standard_normal_sample
  rho = Math.sqrt(-2.0 * Math.log(rng.rand))
  theta = 2 * Math::PI * rng.rand
  tfn = rng.rand > 0.5 ? :cos : :sin
  rho * Math.send(tfn, theta)
end
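
A quick empirical sanity check of the transform (illustrative, and seed-dependent): the mean of many samples should be near 0 and their variance near 1.

rng = Random.new 1
samples = 100_000.times.map do
  rho   = Math.sqrt(-2.0 * Math.log(rng.rand))
  theta = 2 * Math::PI * rng.rand
  rho * Math.send(rng.rand > 0.5 ? :cos : :sin, theta)
end
mean = samples.reduce(:+) / samples.size                              # ≈ 0
var  = samples.reduce(0) { |s, x| s + (x - mean)**2 } / samples.size  # ≈ 1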

#standard_normal_samples ⇒ NMatrix

Samples a standard normal distribution to construct an NMatrix of popsize multivariate samples of length ndims

Returns:

  • (NMatrix)

    standard normal samples

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 84

def standard_normal_samples
  NMatrix.new([popsize, ndims], dtype: dtype) { standard_normal_sample }
end

#utils ⇒ Object

Memoized automatic magic numbers. NOTE: Doubling popsize and halving lrate often helps

# File 'lib/machine_learning_workbench/optimizer/natural_evolution_strategies/base.rb', line 49

def utils;   @utilities ||= cmaes_utilities end