Class: TensorStream::Train::AdagradOptimizer

Inherits:
  Optimizer • Object
Includes:
OpHelper
Defined in:
lib/tensor_stream/train/adagrad_optimizer.rb

Overview

High-level implementation of the Adagrad algorithm. Adagrad adapts a separate learning rate for each parameter by scaling updates with the inverse square root of the accumulated squared gradients for that parameter.
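The per-parameter update rule can be sketched in plain Ruby (this is an illustrative re-implementation of the textbook Adagrad step, not code from this library; the `adagrad_step` helper is hypothetical):

```ruby
# One Adagrad step over a flat parameter vector:
#   accum <- accum + grad^2
#   var   <- var - learning_rate * grad / sqrt(accum)
# The 0.1 starting accumulator mirrors the constructor's
# initial_accumulator_value default shown below.
def adagrad_step(var, grad, accum, learning_rate)
  accum = accum.zip(grad).map { |a, g| a + g * g }
  var = var.zip(grad).zip(accum).map do |(v, g), a|
    v - learning_rate * g / Math.sqrt(a)
  end
  [var, accum]
end

var   = [1.0, 2.0]
grad  = [0.5, -0.5]
accum = [0.1, 0.1] # initial_accumulator_value default

var, accum = adagrad_step(var, grad, accum, 0.1)
```

Because the accumulator only grows, the effective step size for each parameter shrinks over time, which is why Adagrad is often paired with a relatively large initial learning rate.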

Instance Attribute Summary

Attributes inherited from Optimizer

#name

Instance Method Summary

Methods included from OpHelper

#_op, #cons, #format_source, #fp_type?, #i_cons, #i_op, #i_var, #int_type?, #reduced_shape, #shape_eval, #shape_full_specified, #shapes_fully_specified_and_equal

Methods inherited from Optimizer

#apply_gradients, #compute_gradients, #get_slot, #get_slot_names, #minimize

Methods included from SlotCreator

#create_slot, #create_slot_var, #create_slot_with_initializer, #create_zeros_slot

Methods included from Utils

#__v_scope_name, #apply_data_type_coercion, #assign, #check_allowed_types, #check_data_types, #check_if_dense, #colocate_with, #constant, #control_dependencies, #convert_to_tensor, #device, #disable_eager_execution, #dynamic_stitch, #enable_eager_execution, #executing_eagerly?, #float32, #get_collection, #get_default_graph, #get_variable, #get_variable_scope, #global_variables_initializer, #graph, #group, #image, #layers, #list_local_devices, #math, #name_scope, #placeholder, #program, #reset_default_graph, #session, #set_random_seed, #train, #trainable_variables, #variable, #variable_scope

Constructor Details

#initialize(learning_rate, initial_accumulator_value = 0.1, use_locking: false, name: "Adagrad") ⇒ AdagradOptimizer

Returns a new instance of AdagradOptimizer.



# File 'lib/tensor_stream/train/adagrad_optimizer.rb', line 9

def initialize(learning_rate, initial_accumulator_value = 0.1,
  use_locking: false, name: "Adagrad")
  @learning_rate = learning_rate
  @initial_accumulator_value = initial_accumulator_value
  @learning_rate_tensor = nil
  super(name: name, use_locking: use_locking)
end
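A hypothetical usage sketch, assuming the `TensorStream` top-level helpers listed above (`#variable`, `#session`, `#global_variables_initializer` from Utils, and the inherited `#minimize`); the exact call signatures here are assumptions, not taken from this page:

```ruby
require "tensor_stream"

ts = TensorStream

# A trivial least-squares objective: minimize (x - 5)^2.
x = ts.variable(3.0, dtype: :float32, name: "x")
loss = (x - 5.0) ** 2

# Defaults from the constructor above:
# initial_accumulator_value = 0.1, use_locking: false, name: "Adagrad".
optimizer = TensorStream::Train::AdagradOptimizer.new(0.5)
train_op = optimizer.minimize(loss) # inherited from Optimizer

ts.session do |session|
  session.run(ts.global_variables_initializer)
  10.times { session.run(train_op) }
end
```

As with the TensorFlow API this library mirrors, `minimize` combines `compute_gradients` and `apply_gradients` into a single training op.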

Instance Attribute Details

#learning_rate ⇒ Object

Returns the value of attribute learning_rate.



# File 'lib/tensor_stream/train/adagrad_optimizer.rb', line 7

def learning_rate
  @learning_rate
end