Class: Ai4r::NeuralNetwork::Backpropagation

Inherits:
Object
Includes:
Data::Parameterizable
Defined in:
lib/ai4r/neural_network/backpropagation.rb

Overview

Introduction

This is an implementation of a multilayer perceptron network, using the backpropagation algorithm for learning.

Backpropagation is a supervised learning technique, described by Paul Werbos in 1974 and further developed by David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams in 1986.

Features

  • Support for any network architecture (number of layers and neurons)

  • Configurable propagation function

  • Optional usage of bias

  • Configurable momentum

  • Configurable learning rate

  • Configurable initial weight function

  • 100% Ruby code, no external dependencies

Parameters

Use the class method get_parameters_info to obtain details on the algorithm parameters. Use set_parameters to set values for these parameters.

  • :disable_bias => If true, the algorithm will not use bias nodes. False by default.

  • :initial_weight_function => f(n, i, j) must return the initial weight for the connection between node i in layer n and node j in layer n+1. By default, a random number in the [-1, 1) range.

  • :propagation_function => By default: lambda { |x| 1/(1+Math.exp(-1*(x))) }

  • :derivative_propagation_function => Derivative of the propagation function, based on propagation function output. By default: lambda { |y| y*(1-y) }, where y=propagation_function(x)

  • :learning_rate => By default 0.25

  • :momentum => By default 0.1. Set this parameter to 0 to disable momentum
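Note that :propagation_function and :derivative_propagation_function must be swapped as a pair: the derivative is expressed in terms of the propagation function's output y, not its input x. As a sketch of why the defaults pair up, the following plain-Ruby check compares the documented derivative of the logistic sigmoid against a numeric finite difference:

```ruby
# The documented default propagation function (logistic sigmoid) and its
# derivative, written in terms of the sigmoid's own output y = sigmoid(x).
propagation = ->(x) { 1 / (1 + Math.exp(-1 * x)) }
derivative  = ->(y) { y * (1 - y) } # expects y = propagation.(x)

x = 0.7
h = 1.0e-6
# Central finite difference approximates d/dx propagation(x)
numeric  = (propagation.(x + h) - propagation.(x - h)) / (2 * h)
analytic = derivative.(propagation.(x))
puts (numeric - analytic).abs < 1.0e-6
```

If you swap in the tanh alternative commented in the source (lambda { |x| Math.tanh(x) }), swap the derivative to lambda { |y| 1.0 - y**2 } as well; parameters are set via set_parameters, e.g. net.set_parameters(:momentum => 0).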

How to use it

# Create the network with 4 inputs, 1 hidden layer with 3 neurons,
# and 2 outputs
net = Ai4r::NeuralNetwork::Backpropagation.new([4, 3, 2])  

# Train the network
# (example and result are arrays of training inputs and expected outputs)
1000.times do |i|
  net.train(example[i], result[i])
end

# Use it: evaluate data with the trained network
net.eval([12, 48, 12, 25])
    # => [0.86, 0.01]
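Since the training data above is only sketched, the same loop can be shown end to end. The following is a minimal plain-Ruby re-implementation of the algorithm with this class's default settings (sigmoid activation, learning rate 0.25, momentum 0.1, bias nodes), trained on XOR. It is an illustrative sketch with made-up variable names, not the Ai4r source:

```ruby
srand(1) # fixed seed so the random initial weights are reproducible

sigmoid   = ->(x) { 1.0 / (1.0 + Math.exp(-x)) }
d_sigmoid = ->(y) { y * (1.0 - y) } # derivative in terms of the output y

n_in, n_hid, n_out = 2, 3, 1
lr, momentum = 0.25, 0.1

# w1[i][j]: weight from input node i (last row is the bias) to hidden node j
w1 = Array.new(n_in + 1)  { Array.new(n_hid)  { rand * 2 - 1 } }
w2 = Array.new(n_hid + 1) { Array.new(n_out)  { rand * 2 - 1 } }
c1 = Array.new(n_in + 1)  { Array.new(n_hid, 0.0) } # momentum buffers
c2 = Array.new(n_hid + 1) { Array.new(n_out, 0.0) }

forward = lambda do |input|
  a0 = input.map(&:to_f) + [1.0] # append bias activation
  a1 = Array.new(n_hid) { |j| sigmoid.((0..n_in).sum { |i| a0[i] * w1[i][j] }) } + [1.0]
  a2 = Array.new(n_out) { |k| sigmoid.((0..n_hid).sum { |j| a1[j] * w2[j][k] }) }
  [a0, a1, a2]
end

train = lambda do |input, target|
  a0, a1, a2 = forward.(input)
  # Output-layer deltas, then hidden-layer deltas (bias node carries no delta)
  d2 = Array.new(n_out) { |k| d_sigmoid.(a2[k]) * (target[k] - a2[k]) }
  d1 = Array.new(n_hid) { |j| d_sigmoid.(a1[j]) * (0...n_out).sum { |k| d2[k] * w2[j][k] } }
  # Weight updates with momentum on the previous change
  (0..n_hid).each do |j|
    (0...n_out).each do |k|
      change = d2[k] * a1[j]
      w2[j][k] += lr * change + momentum * c2[j][k]
      c2[j][k] = change
    end
  end
  (0..n_in).each do |i|
    (0...n_hid).each do |j|
      change = d1[j] * a0[i]
      w1[i][j] += lr * change + momentum * c1[i][j]
      c1[i][j] = change
    end
  end
end

error_for = lambda do |input, target|
  out = forward.(input).last
  0.5 * (0...n_out).sum { |k| (target[k] - out[k])**2 }
end

data = [[[0, 0], [0.0]], [[0, 1], [1.0]], [[1, 0], [1.0]], [[1, 1], [0.0]]]
before = data.sum { |x, y| error_for.(x, y) }
5000.times { data.each { |x, y| train.(x, y) } }
after = data.sum { |x, y| error_for.(x, y) }
data.each { |x, y| puts "#{x.inspect} -> #{forward.(x).last.first.round(2)} (want #{y.first})" }
```

The real class wraps exactly this loop behind train and eval; the sketch only makes the feedforward/backpropagate split visible.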


About the project

Author

Sergio Fierens

License

MPL 1.1

URL

ai4r.org

Instance Attribute Summary

Instance Method Summary

Methods included from Data::Parameterizable

#get_parameters, included, #set_parameters

Constructor Details

#initialize(network_structure) ⇒ Backpropagation

Creates a new network, specifying its architecture. E.g.

net = Backpropagation.new([4, 3, 2])  # 4 inputs
                                      # 1 hidden layer with 3 neurons, 
                                      # 2 outputs    
net = Backpropagation.new([2, 3, 3, 4])   # 2 inputs
                                          # 2 hidden layers with 3 neurons each, 
                                          # 4 outputs    
net = Backpropagation.new([2, 1])   # 2 inputs
                                    # No hidden layer
                                    # 1 output

# File 'lib/ai4r/neural_network/backpropagation.rb', line 117

def initialize(network_structure)
  @structure = network_structure
  @initial_weight_function = lambda { |n, i, j| ((rand 2000)/1000.0) - 1}
  @propagation_function = lambda { |x| 1/(1+Math.exp(-1*(x))) } #lambda { |x| Math.tanh(x) }
  @derivative_propagation_function = lambda { |y| y*(1-y) } #lambda { |y| 1.0 - y**2 }
  @disable_bias = false
  @learning_rate = 0.25
  @momentum = 0.1
end

Instance Attribute Details

#activation_nodes ⇒ Object

Returns the value of attribute activation_nodes


# File 'lib/ai4r/neural_network/backpropagation.rb', line 103

def activation_nodes
  @activation_nodes
end

#last_changes ⇒ Object

Returns the value of attribute last_changes


# File 'lib/ai4r/neural_network/backpropagation.rb', line 103

def last_changes
  @last_changes
end

#structure ⇒ Object

Returns the value of attribute structure


# File 'lib/ai4r/neural_network/backpropagation.rb', line 103

def structure
  @structure
end

#weights ⇒ Object

Returns the value of attribute weights


# File 'lib/ai4r/neural_network/backpropagation.rb', line 103

def weights
  @weights
end

Instance Method Details

#eval(input_values) ⇒ Object

Evaluates the input. E.g.

net = Backpropagation.new([4, 3, 2])
net.eval([25, 32.3, 12.8, 1.5])
    # =>  [0.83, 0.03]

# File 'lib/ai4r/neural_network/backpropagation.rb', line 132

def eval(input_values)
  check_input_dimension(input_values.length)
  init_network if !@weights
  feedforward(input_values)
  return @activation_nodes.last.clone
end

#eval_result(input_values) ⇒ Object

Evaluates the input and returns the index of the most active output node. E.g.

net = Backpropagation.new([4, 3, 2])
net.eval_result([25, 32.3, 12.8, 1.5])
    # eval gives [0.83, 0.03]
    # =>  0

# File 'lib/ai4r/neural_network/backpropagation.rb', line 145

def eval_result(input_values)
  result = eval(input_values)
  result.index(result.max)
end
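As the source shows, eval_result is just an argmax over eval's output: the index of the largest activation wins. With the hypothetical outputs from the example above:

```ruby
# eval would return something like this two-node activation vector;
# eval_result picks the index of the largest value.
outputs = [0.83, 0.03]
winner  = outputs.index(outputs.max)
puts winner # => 0 (the first output node is the most active)
```

Note that ties resolve to the lowest index, since Array#index returns the first match.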

#init_network ⇒ Object

Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.


# File 'lib/ai4r/neural_network/backpropagation.rb', line 166

def init_network
  init_activation_nodes
  init_weights
  init_last_changes
  return self
end

#train(inputs, outputs) ⇒ Object

This method trains the network using the backpropagation algorithm.

inputs: Network input values.

outputs: Expected output values for the given input.

This method returns the network error:

0.5 * sum( (expected_value - output_value)**2 )


# File 'lib/ai4r/neural_network/backpropagation.rb', line 158

def train(inputs, outputs)
  eval(inputs)
  backpropagate(outputs)
  calculate_error(outputs)
end
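The returned error is half the sum of squared differences between the expected and actual outputs. A quick plain-Ruby check with made-up output values (the expected/actual vectors here are purely illustrative):

```ruby
# Half the sum of squared differences, as train reports it.
expected = [1.0, 0.0]
actual   = [0.86, 0.01]
error = 0.5 * expected.zip(actual).sum { |e, a| (e - a)**2 }
puts error # ~0.00985
```

Because the 0.5 factor cancels the 2 that falls out of differentiating the square, the gradient of this error with respect to an output is simply (output_value - expected_value), which is what backpropagate uses.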