Class: MachineLearningWorkbench::NeuralNetwork::Base

Inherits: Object
Defined in:
lib/machine_learning_workbench/neural_network/base.rb

Overview

Neural Network base class

Direct Known Subclasses

FeedForward, Recurrent

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(struct, act_fn: nil) ⇒ Base

Returns a new instance of Base.

Parameters:

  • struct (Array<Integer>)

    list of layer sizes

  • act_fn (Symbol) (defaults to: nil)

    choice of activation function for the neurons



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 27

def initialize struct, act_fn: nil
  @struct = struct
  @act_fn = self.class.act_fn(act_fn || :sigmoid)
  # @state holds the inputs, possibly the recurrence, and the bias:
  # it is the complete input for the next layer, hence sized from the layer row sizes
  @state = layer_row_sizes.collect do |size|
    NMatrix.zeros([1, size], dtype: :float64)
  end
  # to this, append a matrix to hold the final network output
  @state.push NMatrix.zeros([1, nneurs(-1)], dtype: :float64)
  reset_state
end
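
A minimal construction sketch, using the FeedForward subclass listed above (the layer sizes are an arbitrary example):

# two inputs, a hidden layer of two neurons, one output; default :sigmoid activation
net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 2, 1]
# the activation function can be picked by name
net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 2, 1], act_fn: :lecun_hyperbolic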

Instance Attribute Details

#act_fn ⇒ #call (readonly)

activation function, common to all neurons (for now)

Returns:

  • (#call)

    activation function



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct

#layers ⇒ Array<NMatrix> (readonly)

List of matrices, each being the weights connecting a layer’s inputs (rows) to a layer’s neurons (columns), hence its shape is `[ninputs, nneurs]`

Returns:

  • (Array<NMatrix>)

    list of weight matrices, each uniquely describing a layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

def layers
  @layers
end

#state ⇒ Array<NMatrix> (readonly)

A list of single-row matrices, each an input to a layer, plus the output layer’s activation. The first element is the input to the first layer of the network, composed of the network’s input, possibly the first layer’s activation on the previous input (recurrence), and a bias (fixed `1`). The second through next-to-last entries follow the same structure, with the previous layer’s output in place of the network’s input. The last entry is the activation of the output layer, with no additions since it is not used as input to any layer.

Returns:

  • (Array<NMatrix>)

    current state of the network.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct
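
As a concrete sketch: for a FeedForward network built with struct [2, 2, 1], and assuming the subclass sizes each input row as the layer size plus one bias entry, the state would hold:

net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 2, 1]
net.state.collect(&:shape)  # => [[1, 3], [1, 3], [1, 1]], two biased inputs plus the raw output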

#struct ⇒ Array<Integer> (readonly)

Returns the list of layer sizes used to build the network.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct

Class Method Details

.act_fn(type, *args) ⇒ #call

Activation function builder. Allows the activation function to be defined cleanly as one-dimensional: the returned lambda applies it over a layer’s inputs and builds an NMatrix from the results.

Returns:

  • (#call)

    lambda computing the activations for one layer as a single-row NMatrix



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 171

def self.act_fn type, *args
  fn = send(type, *args)
  lambda do |inputs|
    NMatrix.new([1, inputs.size], dtype: :float64) do |_, i|
      # single-row matrix, indices are columns
      fn.call inputs[i]
    end
  end
end
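
A usage sketch of the builder (class name shortened to Base for brevity):

fn = Base.act_fn :sigmoid  # defaults to steepness k=0.5
fn.call [0.0, 2.0, -2.0]   # => single-row NMatrix holding the three activations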

.lecun_hyperbolic ⇒ #call

LeCun hyperbolic activation


# File 'lib/machine_learning_workbench/neural_network/base.rb', line 198

def self.lecun_hyperbolic
  lambda { |x| 1.7159 * Math.tanh(2.0*x/3.0) + 1e-3*x }
end

.logistic ⇒ #call

Traditional logistic



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 189

def self.logistic
  lambda { |x|
    exp = Math.exp(x)
    # guard against overflow: for large x the logistic saturates at 1
    exp.infinite? ? 1.0 : exp / (1.0 + exp)
  }
end

.sigmoid(k = 0.5) ⇒ #call

Traditional sigmoid with variable steepness



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 182

def self.sigmoid k=0.5
  # k is the steepness: 0<k<1 is flatter, 1<k is steeper
  # flatter makes the activation less sensitive, which works better with a large number of inputs
  lambda { |x| 1.0 / (Math.exp(-k * x) + 1.0) }
end
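
For orientation, a few hand-checked values of the builders above (class name shortened to Base; outputs rounded):

Base.sigmoid.call(0.0)           # => 0.5 (any k: centered at zero)
Base.sigmoid(0.5).call(4.0)      # => ~0.88
Base.sigmoid(2.0).call(4.0)      # => ~0.9997 (larger k, steeper)
Base.logistic.call(0.0)          # => 0.5
Base.lecun_hyperbolic.call(0.0)  # => 0.0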

Instance Method Details

#activate(input) ⇒ Array

Activate the network on a given input

Parameters:

  • input (Array<Float>)

    the given input

Returns:

  • (Array)

    the activation of the output layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 146

def activate input
  raise "Hell!" unless input.is_a? Array
  raise "Hell!" unless input.size == struct.first
  # load the input into the first state
  @state[0][0, 0..-2] = input
  # activate the layers in sequence
  (0...nlayers).each do |i|
    act = activate_layer i
    @state[i+1][0, 0...act.size] = act
  end
  return out
end
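
A usage sketch tying construction, initialization and activation together (weights are random, so the actual output will vary):

net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 2, 1]
net.init_random
net.activate [0.3, 0.7]  # => single-element Array, e.g. [0.62]
net.out                  # same output layer activation, fetched again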

#biasObject

The “fixed `1`” used in the layer’s input



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 139

def bias
  @bias ||= NMatrix[[1], dtype: :float64]
end

#deep_resetObject

Resets memoization: needed when experimenting with structure modification



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 60

def deep_reset
  # reset memoization
  [:@layer_row_sizes, :@layer_col_sizes, :@nlayers, :@layer_shapes,
   :@nweights_per_layer, :@nweights].each do |sym|
    instance_variable_set sym, nil
  end
  reset_state
end

#init_randomObject

Initialize the network with random weights



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 51

def init_random
  # Will only be used for testing, no sense optimizing it (NMatrix#rand)
  # Reusing #load_weights instead helps catch bugs
  load_weights nweights.times.collect { rand(-1.0..1.0) }
end

#interface_methods ⇒ Object

Declares the interface methods: implement these in a child class! A sketch of a conforming subclass follows the code below.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 205

[:layer_row_sizes, :activate_layer].each do |sym|
  define_method sym do
    raise NotImplementedError, "Implement ##{sym} in child class!"
  end
end
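
A minimal sketch of a conforming subclass; this is illustrative, not the actual FeedForward implementation:

class MyFeedForward < MachineLearningWorkbench::NeuralNetwork::Base
  # one row per input of each layer, plus one for the bias
  def layer_row_sizes
    @layer_row_sizes ||= struct[0...-1].collect { |sz| sz + 1 }
  end

  # activation of a layer: biased input times weights, squashed by the activation function
  def activate_layer i
    act_fn.call(state[i].dot(layers[i]))
  end
end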

#layer_col_sizes ⇒ Array

Number of neurons per layer. Although this implementation counts the inputs among the layers, this method correctly ignores the inputs since they have no neurons.

Returns:

  • (Array)

    list of neurons per each (proper) layer (i.e. no inputs)



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 99

def layer_col_sizes
  @layer_col_sizes ||= struct.drop(1)
end

#layer_shapes ⇒ Array<Array[Integer, Integer]>

Shapes for the weight matrices, each corresponding to a layer

Returns:

  • (Array<Array[Integer, Integer]>)

    Weight matrix shapes



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 107

def layer_shapes
  @layer_shapes ||= layer_row_sizes.zip layer_col_sizes
end
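
A worked sketch for struct [2, 3, 1], with FeedForward-style row sizes (layer inputs plus one bias):

# layer_col_sizes  # => [3, 1]          (struct.drop(1))
# layer_row_sizes  # => [3, 4]          (2+1 inputs, 3+1 hidden)
# layer_shapes     # => [[3, 3], [4, 1]]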

#load_weights(weights) ⇒ true

Loads a plain list of weights into the weight matrices (one per layer). Preserves order.

Returns:

  • (true)

    always true. On failure it raises; on success there is nothing to return but a confirmation to the caller.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 125

def load_weights weights
  raise "Hell!" unless weights.size == nweights
  weights_iter = weights.each
  @layers = layer_shapes.collect do |shape|
    NMatrix.new(shape, dtype: :float64) { weights_iter.next }
  end
  reset_state
  return true
end
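
A usage sketch (the weight values are arbitrary):

net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 2, 1]
net.load_weights [0.1] * net.nweights  # => true, weights consumed in order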

#nlayers ⇒ Integer

Counts the layers. This is a computation helper; in this implementation the inputs are counted as if they were a layer like the others.

Returns:

  • (Integer)

    number of layers



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 84

def nlayers
  @nlayers ||= layer_shapes.size
end

#nneurs(nlay = nil) ⇒ Integer

Count the neurons in a particular layer or in the whole network.

Parameters:

  • nlay (Integer, nil) (defaults to: nil)

    the layer of interest, 1-indexed. `0` will return the number of inputs; `nil` will compute the total number of neurons in the network.

Returns:

  • (Integer)

    the number of neurons in the given layer, in the whole network, or the number of inputs



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 116

def nneurs nlay=nil
  nlay.nil? ? struct.reduce(:+) : struct[nlay]
end
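
For example, on a network built with struct [2, 3, 1]:

net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 3, 1]
net.nneurs(0)   # => 2, the number of inputs
net.nneurs(1)   # => 3, the first proper layer
net.nneurs(-1)  # => 1, the output layer (negative indices work as in Array)
net.nneurs      # => 6, total including inputs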

#nweights ⇒ Integer

Total weights in the network

Returns:

  • (Integer)

    total number of weights



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 71

def nweights
  @nweights ||= nweights_per_layer.reduce(:+)
end

#nweights_per_layer ⇒ Array<Integer>

List of per-layer number of weights

Returns:

  • (Array<Integer>)

    number of weights in each layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 77

def nweights_per_layer
  @nweights_per_layer ||= layer_shapes.collect { |shape| shape.reduce(:*) }
end
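
Worked numbers for the [2, 3, 1] FeedForward example above, with shapes [[3, 3], [4, 1]]:

net = MachineLearningWorkbench::NeuralNetwork::FeedForward.new [2, 3, 1]
net.nweights_per_layer  # => [9, 4]  (3*3 and 4*1)
net.nweights            # => 13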

#out ⇒ Array

Extract and convert the output layer’s activation

Returns:

  • (Array)

    the activation of the output layer as a 1-dimensional Array



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 161

def out
  state.last.to_flat_a
end

#reset_state ⇒ Object

Reset the network to the initial state



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 41

def reset_state
  @state.each do |m| # state holds only single-row matrices
    # reset all entries to zero
    m[0, 0..-1] = 0
    # add the bias to all but the output
    m[0, -1] = 1 unless m.equal? @state.last
  end
end

#weights ⇒ Array

Returns the weight matrices converted to nested Arrays

Returns:

  • (Array)

    three-dimensional Array of weights: a list of weight matrices, one for each layer.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 91

def weights
  layers.collect(&:to_consistent_a)
end