Class: TensorStream::Train::GradientDescentOptimizer

Inherits:
Object
Includes:
OpHelper
Defined in:
lib/tensor_stream/train/gradient_descent_optimizer.rb

Overview

High-level implementation of the gradient descent algorithm.
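Gradient descent repeatedly moves each variable in the direction opposite its gradient, scaled by the learning rate. A minimal plain-Ruby sketch of one update step (for illustration only; TensorStream performs this update as a graph op):

```ruby
# One gradient descent step: var <- var - learning_rate * grad.
def gradient_descent_step(var, grad, learning_rate)
  var - learning_rate * grad
end

# Minimizing f(x) = x**2 (gradient 2x) from x = 4.0 with learning rate 0.1:
x = 4.0
10.times { x = gradient_descent_step(x, 2 * x, 0.1) }
```

Each step multiplies x by (1 - 0.1 * 2) = 0.8, so x decays geometrically toward the minimum at 0.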

Instance Attribute Summary

Instance Method Summary

Methods included from OpHelper

#_op, #cons, #format_source, #fp_type?, #i_cons, #i_op, #int_type?, #reduced_shape, #shape_eval, #shape_full_specified, #shapes_fully_specified_and_equal

Constructor Details

#initialize(learning_rate, _options = {}) ⇒ GradientDescentOptimizer

Returns a new instance of GradientDescentOptimizer.



# File 'lib/tensor_stream/train/gradient_descent_optimizer.rb', line 9

def initialize(learning_rate, _options = {})
  @learning_rate = learning_rate
end

Instance Attribute Details

#learning_rate ⇒ Object

Returns the value of attribute learning_rate.



# File 'lib/tensor_stream/train/gradient_descent_optimizer.rb', line 7

def learning_rate
  @learning_rate
end

Instance Method Details

#apply_gradients(grads_and_vars, global_step: nil) ⇒ Object

Apply gradients to variables. This is the second part of minimize(); it returns the ops that apply the gradients (and, when global_step is given, an op that increments it).



# File 'lib/tensor_stream/train/gradient_descent_optimizer.rb', line 21

def apply_gradients(grads_and_vars, global_step: nil)
  apply_ops = grads_and_vars.map do |grad, var|
    i_op(:apply_gradient_descent, var, TensorStream.cast(@learning_rate, grad.data_type), grad)
  end

  if global_step.nil?
    apply_ops
  else
    apply_ops + [global_step.assign_add(1)]
  end
end
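The apply step maps each (gradient, variable) pair to an apply_gradient_descent op and, when a global_step variable is supplied, also increments that counter. A plain-Ruby sketch of the same semantics (hypothetical names; variables modeled as hashes, not TensorStream ops):

```ruby
# Sketch of apply_gradients semantics: each pair updates its variable
# in place; global_step, if given, counts the applied steps.
def apply_gradients_sketch(grads_and_vars, learning_rate, global_step: nil)
  grads_and_vars.each { |grad, var| var[:value] -= learning_rate * grad }
  global_step[:value] += 1 if global_step
end

w = { value: 2.0 }
b = { value: 1.0 }
step = { value: 0 }
apply_gradients_sketch([[0.5, w], [0.25, b]], 0.1, global_step: step)
```

In the real method the updates are returned as graph ops to be run by a session rather than executed eagerly.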

#compute_gradients(loss, var_list: nil, grad_loss: nil) ⇒ Object

Compute gradients of loss for the variables in var_list.

This is the first part of minimize(). It returns a list of (gradient, variable) pairs where “gradient” is the gradient for “variable”.



# File 'lib/tensor_stream/train/gradient_descent_optimizer.rb', line 37

def compute_gradients(loss, var_list: nil, grad_loss: nil)
  trainable_vars = if var_list
                     raise "var_list must be an array" unless var_list.is_a?(Array)
                     var_list.each_with_index { |var, index| raise "var #{index} not a Variable" unless var.is_a?(Variable) }

                     var_list
                   else
                     loss.graph.get_collection(TensorStream::GraphKeys::TRAINABLE_VARIABLES)
                   end
  all_grads = grad_loss || TensorStream.gradients(loss, trainable_vars)
  trainable_vars.each_with_index.collect do |var, index|
    [all_grads[index], var]
  end
end
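compute_gradients returns one [gradient, variable] pair per trainable variable, in variable order. A sketch of that pairing using numerical differentiation as a stand-in for TensorStream.gradients (the helper names here are hypothetical, for illustration only):

```ruby
# Numerical stand-in for TensorStream.gradients: central differences.
def numerical_gradients(loss_fn, vars, eps = 1e-6)
  vars.map.with_index do |_, i|
    up = vars.dup
    up[i] += eps
    down = vars.dup
    down[i] -= eps
    (loss_fn.call(up) - loss_fn.call(down)) / (2 * eps)
  end
end

# Pair each gradient with its variable, mirroring compute_gradients.
def compute_gradients_sketch(loss_fn, vars)
  numerical_gradients(loss_fn, vars).zip(vars)
end

loss = ->(v) { (v[0] - 3.0)**2 + v[1]**2 }  # minimum at [3, 0]
pairs = compute_gradients_sketch(loss, [1.0, 2.0])
```

At [1.0, 2.0] the gradients are roughly -4.0 and 4.0, so the returned pairs are approximately [[-4.0, 1.0], [4.0, 2.0]].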

#minimize(loss, var_list: nil, grad_loss: nil, global_step: nil) ⇒ Object



# File 'lib/tensor_stream/train/gradient_descent_optimizer.rb', line 13

def minimize(loss, var_list: nil, grad_loss: nil, global_step: nil)
  grads_and_vars = compute_gradients(loss, var_list: var_list, grad_loss: grad_loss)
  apply_gradients(grads_and_vars, global_step: global_step)
end
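minimize simply composes the two halves: compute the (gradient, variable) pairs, then apply them. A plain-Ruby sketch of the resulting training loop (an illustration of the algorithm, not the TensorStream session API):

```ruby
# One minimize step: compute gradients, then apply the updates.
def minimize_step(vars, grads, learning_rate)
  vars.each_index { |i| vars[i] -= learning_rate * grads[i] }
end

vars = [0.0]
loss_grad = ->(v) { [2 * (v[0] - 3.0)] }  # gradient of (x - 3)**2
200.times { minimize_step(vars, loss_grad.call(vars), 0.1) }
```

Repeated steps converge on the minimizer x = 3; in TensorStream the equivalent would be running the op returned by minimize inside a session loop.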