Class: MachineLearningWorkbench::NeuralNetwork::Base
Inherits: Object
  - Object
  - MachineLearningWorkbench::NeuralNetwork::Base

Defined in: lib/machine_learning_workbench/neural_network/base.rb
Overview
Neural Network base class
Instance Attribute Summary

- #act_fn ⇒ #call (readonly): activation function, common to all neurons (for now).
- #layers ⇒ Array<NMatrix> (readonly): list of matrices, each being the weights connecting a layer's inputs (rows) to a layer's neurons (columns), hence of shape `[ninputs, nneurs]`.
- #state ⇒ Array<NMatrix> (readonly): list of one-dimensional matrices, each an input to a layer, plus the output layer's output.
- #struct ⇒ Object (readonly): returns the value of attribute struct.
Class Method Summary

- .act_fn(type, *args) ⇒ NMatrix: activation function caller.
- .lecun_hyperbolic ⇒ Object: LeCun hyperbolic activation.
- .logistic ⇒ Object: traditional logistic.
- .sigmoid(k = 0.5) ⇒ Object: traditional sigmoid with variable steepness.
Instance Method Summary

- #activate(input) ⇒ Array: activate the network on a given input.
- #bias ⇒ Object: the fixed `1` used in the layer's input.
- #deep_reset ⇒ Object: resets memoization; needed to play with structure modification.
- #init_random ⇒ Object: initialize the network with random weights.
- #initialize(struct, act_fn: nil) ⇒ Base (constructor): a new instance of Base.
- #interface_methods ⇒ Object: declares interface methods; implement in child class!
- #layer_col_sizes ⇒ Array: number of neurons per layer.
- #layer_shapes ⇒ Array<Array[Integer, Integer]>: shapes for the weight matrices, each corresponding to a layer.
- #load_weights(weights) ⇒ true: loads a plain list of weights into the weight matrices (one per layer).
- #nlayers ⇒ Integer: count the layers.
- #nneurs(nlay = nil) ⇒ Integer: count the neurons in a particular layer or in the whole network.
- #nweights ⇒ Integer: total weights in the network.
- #nweights_per_layer ⇒ Array<Integer>: list of per-layer weight counts.
- #out ⇒ Array: extract and convert the output layer's activation.
- #reset_state ⇒ Object: reset the network to the initial state.
- #weights ⇒ Array: returns the weight matrices (one per layer) as nested Arrays.
Constructor Details
#initialize(struct, act_fn: nil) ⇒ Base
Returns a new instance of Base.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 27

def initialize struct, act_fn: nil
  @struct = struct
  @act_fn = self.class.act_fn(act_fn || :sigmoid)
  # @state holds both inputs, possibly recurrency, and bias
  # it is a complete input for the next layer, hence size from layer sizes
  @state = layer_row_sizes.collect do |size|
    NMatrix.zeros([1, size], dtype: :float64)
  end
  # to this, append a matrix to hold the final network output
  @state.push NMatrix.zeros([1, nneurs(-1)], dtype: :float64)
  reset_state
end
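For intuition, a sketch of the initial @state after construction, assuming a feed-forward child class whose #layer_row_sizes adds one bias slot per layer input (see the hypothetical TinyFeedForward sketch under #interface_methods below), built with struct [2, 3, 2]:

# state[0]: [0.0, 0.0, 1.0]       # network input slots + bias
# state[1]: [0.0, 0.0, 0.0, 1.0]  # first layer's output slots + bias
# state[2]: [0.0, 0.0]            # output layer's activation, no bias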
Instance Attribute Details
#act_fn ⇒ #call (readonly)
Activation function, common to all neurons (for now).
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct
#layers ⇒ Array<NMatrix> (readonly)
List of matrices, each being the weights connecting a layer's inputs (rows) to a layer's neurons (columns), hence of shape `[ninputs, nneurs]`.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

def layers
  @layers
end
#state ⇒ Array<NMatrix> (readonly)
List of one-dimensional matrices, each an input to a layer, plus the output layer's output. The first element is the input to the first layer of the network, composed of the network's input, possibly the first layer's activation on the last input (recurrency), and a bias (fixed `1`). The second through second-to-last entries follow the same structure, with the previous layer's output in place of the network's input. The last entry is the activation of the output layer, with no additions, since it is not used as an input by anyone.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct
#struct ⇒ Object (readonly)
Returns the value of attribute struct.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 20

attr_reader :layers, :state, :act_fn, :struct
Class Method Details
.act_fn(type, *args) ⇒ NMatrix
Activation function caller. Allows cleanly defining the activation function as one-dimensional, by calling it over each input and building an NMatrix to return.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 171

def self.act_fn type, *args
  fn = send(type, *args)
  lambda do |inputs|
    NMatrix.new([1, inputs.size], dtype: :float64) do |_, i|
      # single-row matrix, indices are columns
      fn.call inputs[i]
    end
  end
end
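A minimal usage sketch, following directly from the source above (second value approximate):

fn = MachineLearningWorkbench::NeuralNetwork::Base.act_fn :sigmoid, 0.5
fn.call([0.0, 1.0]).to_flat_a # => [0.5, 0.622] (single-row NMatrix, flattened)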
.lecun_hyperbolic ⇒ Object
LeCun hyperbolic activation
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 198

def self.lecun_hyperbolic
  lambda { |x| 1.7159 * Math.tanh(2.0*x/3.0) + 1e-3*x }
end
.logistic ⇒ Object
Traditional logistic
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 189

def self.logistic
  lambda { |x|
    exp = Math.exp(x)
    exp.infinite? ? exp : exp / (1.0 + exp)
  }
end
.sigmoid(k = 0.5) ⇒ Object
Traditional sigmoid with variable steepness
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 182

def self.sigmoid k=0.5
  # k is steepness: 0<k<1 is flatter, 1<k is steeper
  # flatter makes activation less sensitive, better with large number of inputs
  lambda { |x| 1.0 / (Math.exp(-k * x) + 1.0) }
end
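For a feel of the three activations, here is each lambda evaluated at the same point (values approximate; LeCun's constants are chosen so the output is close to +-1 at x = +-1):

base = MachineLearningWorkbench::NeuralNetwork::Base
base.sigmoid.call 1.0          # => ~0.622 (default k = 0.5)
base.sigmoid(2.0).call 1.0     # => ~0.881 (steeper)
base.logistic.call 1.0         # => ~0.731
base.lecun_hyperbolic.call 1.0 # => ~1.001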
Instance Method Details
#activate(input) ⇒ Array
Activate the network on a given input
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 146

def activate input
  raise "Hell!" unless input.size == struct.first
  raise "Hell!" unless input.is_a? Array
  # load input in first state
  @state[0][0, 0..-2] = input
  # activate layers in sequence
  (0...nlayers).each do |i|
    act = activate_layer i
    @state[i+1][0, 0...act.size] = act
  end
  return out
end
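The slice writes above never touch the trailing bias slot. A standalone sketch of the same NMatrix calls used by #activate and #reset_state:

s = NMatrix.zeros([1, 3], dtype: :float64)
s[0, -1]    = 1          # bias slot, as #reset_state leaves it
s[0, 0..-2] = [0.5, 0.2] # input slots written, bias untouched
s.to_flat_a              # => [0.5, 0.2, 1.0]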
#bias ⇒ Object
The “fixed ‘1`” used in the layer’s input
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 139

def bias
  @bias ||= NMatrix[[1], dtype: :float64]
end
#deep_reset ⇒ Object
Resets memoization: needed to play with structure modification
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 60

def deep_reset
  # reset memoization
  [:@layer_row_sizes, :@layer_col_sizes, :@nlayers, :@layer_shapes,
   :@nweights_per_layer, :@nweights].each do |sym|
    instance_variable_set sym, nil
  end
  reset_state
end
#init_random ⇒ Object
Initialize the network with random weights
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 51

def init_random
  # Will only be used for testing, no sense optimizing it (NMatrix#rand)
  # Reusing #load_weights instead helps catching bugs
  load_weights nweights.times.collect { rand(-1.0..1.0) }
end
#interface_methods ⇒ Object
Declares interface methods: implement in child class!
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 205

[:layer_row_sizes, :activate_layer].each do |sym|
  define_method sym do
    raise NotImplementedError, "Implement ##{sym} in child class!"
  end
end
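A minimal, hypothetical child class for illustration (TinyFeedForward is not part of the gem, and the gem's own subclasses may differ); it provides the two methods declared above:

class TinyFeedForward < MachineLearningWorkbench::NeuralNetwork::Base
  # each layer's input row: previous layer's output plus one bias slot
  def layer_row_sizes
    @layer_row_sizes ||= struct[0...-1].collect { |size| size + 1 }
  end

  # single layer activation: input row times weight matrix, through act_fn
  def activate_layer i
    act_fn.call state[i].dot(layers[i])
  end
end

net = TinyFeedForward.new [2, 3, 2]
net.init_random
net.activate [0.5, 0.2] # => Array of 2 output activations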
#layer_col_sizes ⇒ Array
Number of neurons per layer. Although this implementation includes the inputs in the layer counts, this method correctly ignores the inputs, as they have no neurons.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 99

def layer_col_sizes
  @layer_col_sizes ||= struct.drop(1)
end
#layer_shapes ⇒ Array<Array[Integer, Integer]>
Shapes for the weight matrices, each corresponding to a layer
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 107

def layer_shapes
  @layer_shapes ||= layer_row_sizes.zip layer_col_sizes
end
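A worked example, assuming the hypothetical TinyFeedForward row sizes sketched under #interface_methods and struct [2, 3, 2]:

# struct             = [2, 3, 2]
# layer_row_sizes    = [3, 4]           # each layer's input size + 1 bias
# layer_col_sizes    = [3, 2]           # struct.drop(1)
# layer_shapes       = [[3, 3], [4, 2]] # row sizes zipped with col sizes
# nweights_per_layer = [9, 8]
# nweights           = 17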
#load_weights(weights) ⇒ true
Loads a plain list of weights into the weight matrices (one per layer). Preserves order.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 125

def load_weights weights
  raise "Hell!" unless weights.size == nweights
  weights_iter = weights.each
  @layers = layer_shapes.collect do |shape|
    NMatrix.new(shape, dtype: :float64) { weights_iter.next }
  end
  reset_state
  return true
end
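A round-trip sketch, again using the hypothetical TinyFeedForward from #interface_methods:

net  = TinyFeedForward.new [2, 3, 2]
flat = net.nweights.times.collect { rand(-1.0..1.0) } # 17 values, see #layer_shapes
net.load_weights flat # => true; fills each layer's matrix in order
net.weights           # => nested Arrays, one per layer, same ordering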
#nlayers ⇒ Integer
Count the layers. This is a computation helper; for this implementation, the inputs are considered as if they were a layer like the others.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 84

def nlayers
  @nlayers ||= layer_shapes.size
end
#nneurs(nlay = nil) ⇒ Integer
Count the neurons in a particular layer or in the whole network.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 116

def nneurs nlay=nil
  nlay.nil? ? struct.reduce(:+) : struct[nlay]
end
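For example, with struct [2, 3, 2]:

net.nneurs     # => 7 (2 + 3 + 2; inputs are counted as a layer)
net.nneurs(-1) # => 2 (output layer only)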
#nweights ⇒ Integer
Total weights in the network
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 71

def nweights
  @nweights ||= nweights_per_layer.reduce(:+)
end
#nweights_per_layer ⇒ Array<Integer>
List of per-layer number of weights
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 77

def nweights_per_layer
  @nweights_per_layer ||= layer_shapes.collect { |shape| shape.reduce(:*) }
end
#out ⇒ Array
Extract and convert the output layer’s activation
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 161

def out
  state.last.to_flat_a
end
#reset_state ⇒ Object
Reset the network to the initial state
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 41

def reset_state
  @state.each do |m| # state has only single-row matrices
    # reset all to zero
    m[0, 0..-1] = 0
    # add bias to all but output
    m[0, -1] = 1 unless m.object_id == @state.last.object_id
  end
end
#weights ⇒ Array
Returns the weight matrices (one per layer) as nested Arrays.
# File 'lib/machine_learning_workbench/neural_network/base.rb', line 91

def weights
  layers.collect(&:to_consistent_a)
end