Class: Rumale::Optimizer::AdaGrad

Inherits:
Object
Includes:
Base::BaseEstimator
Defined in:
lib/rumale/optimizer/ada_grad.rb

Overview

AdaGrad is a class that implements the AdaGrad optimizer.

Reference

    1. J. Duchi, E. Hazan, and Y. Singer, “Adaptive Subgradient Methods for Online Learning and Stochastic Optimization,” J. Machine Learning Research, vol. 12, pp. 2121–2159, 2011.
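The per-feature update applied by #call, sketched from the rule in the reference above (η is the learning rate, g_t the gradient at step t, and ε a small constant for numerical stability):

    v_t = v_{t-1} + g_t²
    w_t = w_{t-1} - (η / (√v_t + ε)) * g_t

In the source below, v corresponds to @moment, η to @params[:learning_rate], and ε to 1.0e-8.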

Examples:

optimizer = Rumale::Optimizer::AdaGrad.new(learning_rate: 0.01)
estimator = Rumale::LinearModel::LinearRegression.new(optimizer: optimizer, random_seed: 1)
estimator.fit(samples, values)

Instance Attribute Summary

Attributes included from Base::BaseEstimator

#params

Instance Method Summary

Constructor Details

#initialize(learning_rate: 0.01) ⇒ AdaGrad

Create a new optimizer with AdaGrad.

Parameters:

  • learning_rate (Float) (defaults to: 0.01)

    The initial value of the learning rate.



# File 'lib/rumale/optimizer/ada_grad.rb', line 24

def initialize(learning_rate: 0.01)
  check_params_float(learning_rate: learning_rate)
  check_params_positive(learning_rate: learning_rate)
  @params = {}
  @params[:learning_rate] = learning_rate
  @moment = nil
end
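A brief construction sketch (the 0.05 learning rate is an arbitrary value chosen for illustration):

optimizer = Rumale::Optimizer::AdaGrad.new(learning_rate: 0.05)
optimizer.params[:learning_rate] # => 0.05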

Instance Method Details

#call(weight, gradient) ⇒ Numo::DFloat

Calculate the updated weight with the AdaGrad adaptive learning rate.

Parameters:

  • weight (Numo::DFloat)

    (shape: [n_features]) The weight to be updated.

  • gradient (Numo::DFloat)

    (shape: [n_features]) The gradient for updating the weight.

Returns:

  • (Numo::DFloat)

    (shape: [n_features]) The updated weight.



# File 'lib/rumale/optimizer/ada_grad.rb', line 37

def call(weight, gradient)
  @moment ||= Numo::DFloat.zeros(weight.shape[0])
  @moment += gradient**2
  weight - (@params[:learning_rate] / (@moment**0.5 + 1.0e-8)) * gradient
end
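A standalone usage sketch of #call (the toy weight and gradient arrays are made up for illustration):

require 'rumale'

optimizer = Rumale::Optimizer::AdaGrad.new(learning_rate: 0.1)
weight = Numo::DFloat[0.5, -0.2, 0.1]
gradient = Numo::DFloat[0.3, 0.1, -0.4]

# Each call accumulates the squared gradient and shrinks the effective
# step size for features that have already seen large gradients.
weight = optimizer.call(weight, gradient)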

#marshal_dump ⇒ Hash

Dump marshal data.

Returns:

  • (Hash)

    The marshal data.



# File 'lib/rumale/optimizer/ada_grad.rb', line 45

def marshal_dump
  { params: @params,
    moment: @moment }
end

#marshal_load(obj) ⇒ nil

Load marshal data.

Parameters:

  • obj (Hash)

    The marshal data.

Returns:

  • (nil)


# File 'lib/rumale/optimizer/ada_grad.rb', line 52

def marshal_load(obj)
  @params = obj[:params]
  @moment = obj[:moment]
  nil
end
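Because both marshal hooks are defined, an optimizer can be round-tripped with Ruby's Marshal, preserving its accumulated moment, as in this sketch:

require 'rumale'

optimizer = Rumale::Optimizer::AdaGrad.new(learning_rate: 0.01)
data = Marshal.dump(optimizer)
restored = Marshal.load(data)
restored.params[:learning_rate] # => 0.01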