Class: MachineLearningWorkbench::NeuralNetwork::Recurrent

Inherits:
  Base
    • Object
Defined in:
lib/machine_learning_workbench/neural_network/recurrent.rb

Overview

Recurrent Neural Network

Instance Attribute Summary

Attributes inherited from Base

#act_fn, #act_fn_name, #layers, #state, #struct

Instance Method Summary

Methods inherited from Base

#activate, #deep_reset, #init_random, #initialize, #interface_methods, #layer_col_sizes, #layer_shapes, #lecun_hyperbolic, #load_weights, #nlayers, #nneurs, #nweights, #nweights_per_layer, #out, #relu, #reset_state, #sigmoid, #weights

Constructor Details

This class inherits a constructor from MachineLearningWorkbench::NeuralNetwork::Base

Instance Method Details

#activate_layer(nlay) ⇒ Object

Activates a layer of the network. Slightly more involved than the feed-forward case: it must first copy the layer's activation from the last time step into its own inputs, which implements the recurrence.

Parameters:

  • nlay (Integer)

    the layer to activate, zero-indexed



# File 'lib/machine_learning_workbench/neural_network/recurrent.rb', line 33

def activate_layer nlay
  # Mark begin and end of recursion outputs in current state
  begin_recur = nneurs(nlay)
  end_recur = nneurs(nlay) + nneurs(nlay+1)
  # Copy the level's last-time activation to the current input recurrency
  state[nlay][begin_recur...end_recur] = state[nlay+1][0...nneurs(nlay+1)]
  # Activate current layer
  act_fn.call state[nlay].dot layers[nlay]
end
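The recurrent copy above can be illustrated with a minimal sketch. This is a hypothetical, plain-Ruby stand-in (plain arrays instead of the matrix type used by the library) for a network with 2 inputs and 3 recurrent hidden neurons; the `struct`, `state`, and values are invented for illustration:

```ruby
struct = [2, 3]            # 2 inputs, 3 recurrent hidden neurons
nneurs = ->(i) { struct[i] }

# Each state row holds: [inputs | this layer's last activations (recurrence) | bias]
state = [
  Array.new(struct[0] + struct[1] + 1, 0.0),  # layer-0 input row (size 6)
  Array.new(struct[1], 0.0)                   # layer-1 activations (size 3)
]
state[0][-1] = 1.0          # bias unit is always 1

# Pretend the previous time step produced these layer-1 activations
state[1] = [0.1, 0.2, 0.3]

# The recurrent copy performed by #activate_layer for nlay = 0:
begin_recur = nneurs.(0)                 # => 2
end_recur   = nneurs.(0) + nneurs.(1)    # => 5
state[0][begin_recur...end_recur] = state[1][0...nneurs.(1)]

state[0]  # => [0.0, 0.0, 0.1, 0.2, 0.3, 1.0]
```

The last-step activations now sit between the fresh inputs and the bias, ready to be multiplied against the layer's weight matrix.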

#layer_row_sizes ⇒ Array<Integer>

Calculates the size of each row in a layer's weight matrix. Each row holds the inputs for the next level: the previous level's activations (or the network inputs), this level's last activations (the recurrence), and a bias.

Returns:

  • (Array<Integer>)

    per-layer row sizes



# File 'lib/machine_learning_workbench/neural_network/recurrent.rb', line 12

def layer_row_sizes
  @layer_row_sizes ||= struct.each_cons(2).collect do |prev, rec|
    prev + rec + 1
  end
end
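As a concrete check, the computation above can be run on a hypothetical network shape (the `struct` below is invented for illustration):

```ruby
# 2 inputs, 3 recurrent hidden neurons, 2 outputs
struct = [2, 3, 2]

# Same logic as #layer_row_sizes: each row feeds the next level with the
# previous level's activations, the recurrent copy, and a bias unit.
layer_row_sizes = struct.each_cons(2).collect do |prev, rec|
  prev + rec + 1
end

layer_row_sizes  # => [6, 6]   (2+3+1 and 3+2+1)
```

Note the recurrent term `rec` is what distinguishes this from the feed-forward `Base` network, whose rows would only need `prev + 1` entries.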