Class: MachineLearningWorkbench::NeuralNetwork::Base

Inherits:
Object
Defined in:
lib/machine_learning_workbench/neural_network/base.rb

Overview

Neural Network base class

Direct Known Subclasses

FeedForward, Recurrent

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(struct, act_fn: nil, **act_fn_args) ⇒ Base

Returns a new instance of Base.

Parameters:

  • struct (Array<Integer>)

    list of layer sizes

  • act_fn (Symbol) (defaults to: nil)

    choice of activation function for the neurons



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 30

def initialize struct, act_fn: nil, **act_fn_args
  @struct = struct
  @act_fn_name = act_fn || :sigmoid
  @act_fn = send act_fn_name, **act_fn_args
  # @state holds both inputs, possibly recurrency, and bias
  # it is a complete input for the next layer, hence size from layer sizes
  @state = layer_row_sizes.collect do |size|
    NArray.zeros [1, size]
  end
  # to this, append a matrix to hold the final network output
  @state.push NArray.zeros [1, nneurs(-1)]
  reset_state
end
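To illustrate how `struct` drives the state allocation above, here is a plain-Ruby sketch (arrays instead of NArray, not using the gem). It assumes a feedforward subclass where each layer's input row holds the previous layer's output plus one bias entry; the `struct` value `[2, 3, 1]` is a hypothetical example:

```ruby
# Hypothetical feedforward sizing for struct = [2, 3, 1]:
# each layer reads the previous layer's output plus a fixed bias entry.
struct = [2, 3, 1]

# one input row per weight layer: previous layer's size + 1 (bias)
layer_row_sizes = struct.each_cons(2).map { |prev, _| prev + 1 }  # => [3, 4]

# @state gets one zero row per layer input, plus a row for the final output
state_sizes = layer_row_sizes + [struct.last]                     # => [3, 4, 1]
```

The trailing entry matches the `@state.push NArray.zeros [1, nneurs(-1)]` line: the output row is sized by the last layer's neuron count alone, with no bias slot.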

Instance Attribute Details

#act_fn ⇒ #call (readonly)

activation function, common to all neurons (for now)

Returns:

  • (#call)

    activation function



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 23

attr_reader :layers, :state, :act_fn, :act_fn_name, :struct

#act_fn_name ⇒ Object (readonly)

Returns the value of attribute act_fn_name.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 23

def act_fn_name
  @act_fn_name
end

#layers ⇒ Array<NArray> (readonly)

List of matrices, each being the weights connecting a layer's inputs (rows) to a layer's neurons (columns), hence its shape is `[ninputs, nneurs]`. TODO: return an NArray after the usage of `#map` is figured out.

Returns:

  • (Array<NArray>)

    list of weight matrices, each uniquely describing a layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 23

def layers
  @layers
end

#state ⇒ Array<NArray> (readonly)

A list of one-dimensional matrices, each an input to a layer, plus the output layer's output. The first element is the input to the first layer of the network, composed of the network's input, possibly the first layer's activation on the last input (recurrence), and a bias (fixed `1`). The second through second-to-last entries follow the same structure, but with the previous layer's output in place of the network's input. The last entry is the activation of the output layer, with no additions since it is not used as an input by any layer. TODO: return an NArray after the usage of `#map` is figured out.

Returns:

  • (Array<NArray>)

    current state of the network.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 23

attr_reader :layers, :state, :act_fn, :act_fn_name, :struct

#struct ⇒ Object (readonly)

Returns the value of attribute struct.



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 23

attr_reader :layers, :state, :act_fn, :act_fn_name, :struct

Instance Method Details

#activate(input) ⇒ Array

Activate the network on a given input

Parameters:

  • input (Array<Float>)

    the given input

Returns:

  • (Array)

    the activation of the output layer

Raises:

  • (ArgumentError)


# File 'lib/machine_learning_workbench/neural_network/base.rb', line 147

def activate input
  raise ArgumentError unless input.size == struct.first
  # load input in first state
  state[0][0...struct.first] = input
  # activate layers in sequence
  nlayers.times.each do |i|
    act = activate_layer i
    state[i+1][0...act.size] = act
  end
  return out
end
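The per-layer step inside that loop can be sketched in plain Ruby (arrays and `Math` in place of NArray; the weight and input values are hypothetical). Each neuron column is the activation of the dot product between the layer's state row, which includes the trailing bias, and that column of the weight matrix:

```ruby
# Plain-Ruby sketch of one layer step from #activate.
sigmoid = ->(x) { 1.0 / (Math.exp(-x) + 1.0) }

state_row = [0.5, -0.2, 1.0]   # two inputs plus the fixed bias `1`
weights   = [[0.1, 0.4],       # 3 input rows x 2 neuron columns
             [0.2, 0.3],
             [0.5, 0.6]]

# act = sigmoid(state_row . weights), one value per neuron column
act = weights.transpose.map do |col|
  sigmoid.call(col.zip(state_row).sum { |w, s| w * s })
end
# act has one entry per neuron, each in (0, 1)
```

The resulting `act` is then copied into the next state row, leaving that row's bias slot untouched.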

#deep_reset ⇒ Object

Resets memoization: needed to play with structure modification



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 63

def deep_reset
  # reset memoization
  [:@layer_row_sizes, :@layer_col_sizes, :@nlayers, :@layer_shapes,
   :@nweights_per_layer, :@nweights].each do |sym|
     instance_variable_set sym, nil
  end
  reset_state
end

#init_random ⇒ Object

Initialize the network with random weights



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 54

def init_random
  # Reusing `#load_weights` instead helps catch bugs
  deep_reset
  load_weights NArray.new(nweights).rand(-1,1)
end

#interface_methods ⇒ Object

Declaring interface methods - implement in child class!



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 189

[:layer_row_sizes, :activate_layer].each do |sym|
  define_method sym do
    raise NotImplementedError, "Implement ##{sym} in child class!"
  end
end

#layer_col_sizes ⇒ Array

Number of neurons per layer. Although this implementation includes the inputs in the layer counts, this method correctly excludes the input layer, which has no neurons.

Returns:

  • (Array)

    list of neurons per each (proper) layer (i.e. no inputs)



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 101

def layer_col_sizes
  @layer_col_sizes ||= struct.drop(1)
end

#layer_shapes ⇒ Array<Array[Integer, Integer]>

Shapes for the weight matrices, each corresponding to a layer

Returns:

  • (Array<Array[Integer, Integer]>)

    Weight matrix shapes



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 109

def layer_shapes
  @layer_shapes ||= layer_row_sizes.zip layer_col_sizes
end
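The zip of row and column sizes, and the weight counts derived from it in `#nweights_per_layer` and `#nweights`, can be traced in plain Ruby. The sizes below correspond to a hypothetical feedforward `struct` of `[2, 3, 1]` with one bias entry per layer input:

```ruby
layer_row_sizes = [3, 4]   # inputs per layer, including bias (hypothetical)
layer_col_sizes = [3, 1]   # struct.drop(1): neurons per layer

layer_shapes = layer_row_sizes.zip(layer_col_sizes)   # => [[3, 3], [4, 1]]

nweights_per_layer = layer_shapes.map { |shape| shape.reduce(:*) }  # => [9, 4]
nweights = nweights_per_layer.sum                                   # => 13
```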

#lecun_hyperbolic ⇒ Object

LeCun hyperbolic activation




# File 'lib/machine_learning_workbench/neural_network/base.rb', line 177

def lecun_hyperbolic
  -> (vec) { 1.7159 * NMath.tanh(2.0*vec/3.0) + 1e-3*vec }
end
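A scalar version of the same lambda, using `Math.tanh` in place of `NMath.tanh`, shows the effect of the constants: the function is centered at zero and scaled so that f(1) is approximately 1, with the small `1e-3*vec` term keeping the gradient from vanishing for large inputs:

```ruby
lecun = ->(x) { 1.7159 * Math.tanh(2.0 * x / 3.0) + 1e-3 * x }

lecun.call(0.0)  # => 0.0
lecun.call(1.0)  # ~ 1.001
```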

#load_weights(weights) ⇒ true

Loads a plain list of weights into the weight matrices (one per layer). Preserves order. Reuses allocated memory if available.

Returns:

  • (true)

    always true. If something’s wrong it simply fails, and if all goes well there’s nothing to return but a confirmation to the caller.

Raises:

  • (ArgumentError)


# File 'lib/machine_learning_workbench/neural_network/base.rb', line 127

def load_weights weights
  raise ArgumentError unless weights.size == nweights
  weights = weights.to_na unless weights.kind_of? NArray
  from = 0
  @layers = layer_shapes.collect do |shape|
    to = from + shape.reduce(:*)
    lay_w = weights[from...to].reshape *shape
    from = to
    lay_w
  end
  reset_state
  return true
end
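The order-preserving slicing that `#load_weights` performs can be sketched with plain Ruby arrays (integers stand in for weights, and the shapes are the hypothetical ones from above; the real code additionally reshapes each slice into its matrix shape):

```ruby
layer_shapes = [[3, 3], [4, 1]]   # hypothetical shapes
weights = (1..13).to_a            # 9 + 4 = 13 flat weights

from = 0
layers = layer_shapes.map do |shape|
  to = from + shape.reduce(:*)    # consecutive chunk for this layer
  lay_w = weights[from...to]      # real code: .reshape(*shape)
  from = to
  lay_w
end

layers[0]  # => [1, 2, 3, 4, 5, 6, 7, 8, 9]
layers[1]  # => [10, 11, 12, 13]
```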

#nlayers ⇒ Integer

Count the layers. This is a computation helper; in this implementation the inputs are treated as a layer like the others.

Returns:

  • (Integer)

    number of layers



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 87

def nlayers
  @nlayers ||= layer_shapes.size
end

#nneurs(nlay = nil) ⇒ Integer

Count the neurons in a particular layer or in the whole network.

Parameters:

  • nlay (Integer, nil) (defaults to: nil)

    the layer of interest, 1-indexed. `0` will return the number of inputs. `nil` will compute the total neurons in the network.

Returns:

  • (Integer)

    the number of neurons in a given layer, or in all network, or the number of inputs



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 118

def nneurs nlay=nil
  nlay.nil? ? struct.reduce(:+) : struct[nlay]
end

#nweights ⇒ Integer

Total weights in the network

Returns:

  • (Integer)

    total number of weights



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 74

def nweights
  @nweights ||= nweights_per_layer.reduce(:+)
end

#nweights_per_layer ⇒ Array<Integer>

List of per-layer number of weights

Returns:

  • (Array<Integer>)

    list of weights per each layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 80

def nweights_per_layer
  @nweights_per_layer ||= layer_shapes.collect { |shape| shape.reduce(:*) }
end

#out ⇒ NArray

Extract and convert the output layer’s activation

Returns:

  • (NArray)

    the activation of the output layer



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 161

def out
  state.last.flatten
end

#relu ⇒ Object

Rectified Linear Unit (ReLU)



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 182

def relu
  -> (vec) { (vec>0).all? && vec || vec.class.zeros(vec.shape) }
end

#reset_state ⇒ Object

Reset the network to the initial state



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 45

def reset_state
  state.each do |s|
    s.fill 0           # reset state to zero
    s[-1] = 1        # add bias
  end
  state[-1][-1] = 0  # last layer has no bias
end
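With plain Ruby arrays in place of NArray rows (the sizes here are hypothetical), the reset leaves every state row zeroed with its last slot set to the bias `1`, except the output row, which carries no bias:

```ruby
state = [[0.3, 0.7, 0.9], [0.1, 0.2, 0.4, 0.8], [0.5]]

state.each do |s|
  s.fill(0)          # reset state to zero
  s[-1] = 1          # add bias
end
state[-1][-1] = 0    # last layer has no bias

state  # => [[0, 0, 1], [0, 0, 0, 1], [0]]
```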

#sigmoid(steepness: 1) ⇒ Object Also known as: logistic

Traditional sigmoid (logistic) with variable steepness



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 168

def sigmoid steepness: 1
  # steepness:  0<s<1 is flatter, 1<s is steeper
  # flatter makes activation less sensitive, better with large number of inputs
  -> (vec) { 1.0 / (NMath.exp(-steepness * vec) + 1.0) }
end
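A scalar version of the returned lambda (using `Math.exp` instead of `NMath.exp`) shows what the steepness parameter changes: the midpoint stays at 0.5, while the curve rises away from it faster or slower:

```ruby
sigmoid = ->(steepness) { ->(x) { 1.0 / (Math.exp(-steepness * x) + 1.0) } }

flat  = sigmoid.call(0.5)
steep = sigmoid.call(2.0)

flat.call(0.0)                    # => 0.5 (steepness never moves the midpoint)
steep.call(1.0) > flat.call(1.0)  # => true: steeper rises faster from 0.5
```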

#weights ⇒ Array<NArray>

Returns the weight matrices (one per layer)

Returns:

  • (Array<NArray>)

    list of NArray matrices of weights (one per layer).



# File 'lib/machine_learning_workbench/neural_network/base.rb', line 93

def weights
  layers
end