Class: Newral::Functions::Base
- Inherits: Object
  - Object
  - Newral::Functions::Base
- Defined in: lib/newral/functions/base.rb
Overview
The idea behind the function classes is to provide a common interface for functions that are optimized by the training algorithms.
Direct Known Subclasses
Block, Gaussian, Line, Polynomial, RadialBasisFunctionNetwork, Vector
Instance Attribute Summary collapse
- #center ⇒ Object
  Returns the value of attribute center.
Class Method Summary collapse
- .create_random(low_range: -9, high_range: 9) ⇒ Object
  If a function implements calculate, move, and number_of_directions, it can be used with all training algorithms (such as hill climbing); as the network classes in Networks show, you do not need to derive from this base class.
Instance Method Summary collapse
- #calculate ⇒ Object
- #calculate_descent(input, difference: nil) ⇒ Object
  Approximates the descent (slope) of the function at a given point.
- #calculate_error(input: [], output: []) ⇒ Object
- #calculate_for_center_distance(vector1) ⇒ Object
- #error_gradient_approximation(direction: nil, step: 0.01, input: nil, output: nil) ⇒ Object
- #find_minimum(start_input, max_iterations: 1000, treshold: 10**-9, learning_rate: 0.01) ⇒ Object
  Finds a (local) minimum of the function via gradient descent.
- #move(direction: 0, step: 0.01, step_percentage: nil) ⇒ Object
- #move_random(low_range: -0.9, high_range: 0.9) ⇒ Object
  Moves all directions randomly.
- #move_several(directions: [], step: 0.01, step_percentage: nil) ⇒ Object
- #move_with_gradient(input: [], output: [], learning_rate: 0.01, step: 0.01) ⇒ Object
  For general functions the gradient of the error can only be estimated by taking small steps.
- #number_of_directions ⇒ Object
Instance Attribute Details
#center ⇒ Object
Returns the value of attribute center.
# File 'lib/newral/functions/base.rb', line 11

def center
  @center
end
Class Method Details
.create_random(low_range: -9, high_range: 9) ⇒ Object
If a function implements calculate, move, and number_of_directions, it can be used with all training algorithms (such as hill climbing); as the network classes in Networks show, you do not need to derive from this base class.
# File 'lib/newral/functions/base.rb', line 47

def self.create_random( low_range: -9, high_range: 9 )
  raise Errors::NotImplemented
end
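As noted above, any object responding to calculate, move, and number_of_directions can be plugged into the training algorithms, with or without inheriting from Base. A minimal sketch of such a class (the name SimpleLine and its internals are illustrative, not part of the library):

```ruby
# A minimal stand-in implementing the interface the training algorithms
# expect: calculate, move and number_of_directions.
# Illustrative only -- not part of the Newral library itself.
class SimpleLine
  def initialize(slope: 1.0, offset: 0.0)
    @params = [slope, offset]
  end

  # evaluate the function at x
  def calculate(x)
    @params[0] * x + @params[1]
  end

  # one adjustable "direction" per parameter
  def number_of_directions
    @params.size
  end

  # shift the parameter selected by +direction+ by +step+
  def move(direction: 0, step: 0.01)
    @params[direction] += step
    self
  end
end

line = SimpleLine.new(slope: 2.0, offset: 1.0)
line.move(direction: 1, step: 0.5)   # offset becomes 1.5
puts line.calculate(3.0)             # 2.0 * 3.0 + 1.5 = 7.5
```

A subclass of Base would override create_random, move, and calculate in the same spirit while inheriting the error and gradient helpers.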
Instance Method Details
#calculate ⇒ Object
# File 'lib/newral/functions/base.rb', line 13

def calculate
  raise NotImplemented
end
#calculate_descent(input, difference: nil) ⇒ Object
Approximates the descent (slope) of the function at a given point using a forward finite difference.

# File 'lib/newral/functions/base.rb', line 18

def calculate_descent( input, difference: nil )
  difference = (input/10000.0).abs unless difference
  (calculate( input+difference )- calculate( input ))/difference
end
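The finite-difference idea behind calculate_descent can be seen on a plain lambda; the quadratic below is just an example, not taken from the library:

```ruby
# Forward finite difference: slope of f(x) = x**2 at x = 3
# (true derivative: 6). Mirrors the approximation in #calculate_descent,
# defaulting the step to a small fraction of the input.
def finite_difference(f, input, difference: nil)
  difference = (input / 10_000.0).abs if difference.nil?
  (f.call(input + difference) - f.call(input)) / difference
end

f = ->(x) { x**2 }
slope = finite_difference(f, 3.0)
puts slope # ≈ 6.0 (forward difference adds a small bias of about +0.0003)
```

Because the step scales with the input, the approximation degrades near input = 0; passing an explicit difference avoids that edge case.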
#calculate_error(input: [], output: []) ⇒ Object
# File 'lib/newral/functions/base.rb', line 76

def calculate_error( input: [], output: [] )
  expected_values = [] # output can be longer than input
  calculated_values = []
  input.each_with_index do |x,idx|
    calculated_values << calculate( x )
    expected_values << output[idx]
  end
  Newral::ErrorCalculation.root_mean_square( calculated_values, expected_values )
end
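calculate_error delegates to Newral::ErrorCalculation.root_mean_square; a standalone sketch of that metric, assuming the conventional root-mean-square-error formula:

```ruby
# Root mean square error between calculated and expected values:
# sqrt(mean of squared differences). A sketch of the formula
# #calculate_error relies on, not the library's own implementation.
def root_mean_square(calculated, expected)
  sum = calculated.each_with_index.sum { |c, i| (c - expected[i])**2 }
  Math.sqrt(sum / calculated.size.to_f)
end

puts root_mean_square([1.0, 2.0, 3.0], [1.0, 2.0, 5.0])
# sqrt((0 + 0 + 4) / 3) ≈ 1.1547
```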
#calculate_for_center_distance(vector1) ⇒ Object
# File 'lib/newral/functions/base.rb', line 39

def calculate_for_center_distance( vector1 )
  calculate Newral::Tools.euclidian_distance( vector1, @center )
end
#error_gradient_approximation(direction: nil, step: 0.01, input: nil, output: nil) ⇒ Object
# File 'lib/newral/functions/base.rb', line 86

def error_gradient_approximation( direction: nil, step: 0.01, input: nil, output: nil )
  current_error = calculate_error( input: input, output: output )
  new_pos = self.dup.move( direction: direction, step: step )
  new_error = new_pos.calculate_error( input: input, output: output )
  (new_error-current_error)/step
end
#find_minimum(start_input, max_iterations: 1000, treshold: 10**-9, learning_rate: 0.01) ⇒ Object
Finds a (local) minimum of the function via gradient descent.
# File 'lib/newral/functions/base.rb', line 24

def find_minimum( start_input, max_iterations: 1000, treshold: 10**-9, learning_rate: 0.01 )
  descent = calculate_descent( start_input )
  iterations = 0
  input = start_input
  while descent.abs > treshold && iterations < max_iterations
    old_input = input
    input = input-descent.to_f*learning_rate
    new_descent = calculate_descent( input )
    learning_rate = learning_rate.to_f/10 if new_descent*descent < 0 # slow down if descent changes
    descent = new_descent
    iterations = iterations+1
  end
  { input: input, descent: descent, learning_rate: learning_rate, output: calculate( input ), iterations: iterations }
end
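The same loop structure (finite-difference slope, step against it, threshold and iteration cap) can be seen on a plain quadratic; the function and constants below are illustrative:

```ruby
# Gradient descent on f(x) = (x - 2)**2, mirroring the loop shape of
# #find_minimum. Self-contained sketch; the quadratic is just an example.
f = ->(x) { (x - 2.0)**2 }
descent_of = ->(x) do
  h = [x.abs / 10_000.0, 1e-8].max      # forward-difference step
  (f.call(x + h) - f.call(x)) / h
end

input         = 10.0
learning_rate = 0.1
iterations    = 0
descent       = descent_of.call(input)
while descent.abs > 1e-9 && iterations < 1000
  input -= descent * learning_rate      # step against the slope
  descent = descent_of.call(input)
  iterations += 1
end

puts input.round(3) # ≈ 2.0, the minimum of f
```

The library version additionally shrinks learning_rate whenever the descent changes sign, which damps oscillation around the minimum.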
#move(direction: 0, step: 0.01, step_percentage: nil) ⇒ Object
# File 'lib/newral/functions/base.rb', line 56

def move( direction: 0, step: 0.01, step_percentage: nil )
  raise Errors::NotImplemented
end
#move_random(low_range: -0.9, high_range: 0.9) ⇒ Object
Moves all directions randomly.
# File 'lib/newral/functions/base.rb', line 68

def move_random( low_range: -0.9, high_range: 0.9 )
  number_of_directions.times do |direction|
    step = low_range+rand()*(high_range.to_f-low_range.to_f)
    move( direction: direction, step: step )
  end
  self
end
#move_several(directions: [], step: 0.01, step_percentage: nil) ⇒ Object
# File 'lib/newral/functions/base.rb', line 60

def move_several( directions: [], step: 0.01, step_percentage: nil )
  directions.each do |direction|
    move( direction: direction, step: step, step_percentage: step_percentage )
  end
  self
end
#move_with_gradient(input: [], output: [], learning_rate: 0.01, step: 0.01) ⇒ Object
For general functions the gradient of the error can only be estimated by taking small steps.

# File 'lib/newral/functions/base.rb', line 95

def move_with_gradient( input: [], output: [], learning_rate: 0.01, step: 0.01 )
  number_of_directions.times do |direction|
    error_gradient = error_gradient_approximation( direction: direction, step: step, input: input, output: output )
    move( direction: direction, step: (-error_gradient*learning_rate) )
  end
  self
end
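Combining the two gradient helpers gives a complete numeric training loop. The sketch below fits a line to data the same way error_gradient_approximation and move_with_gradient do (estimate the error change per direction, then step against it); it uses plain mean squared error for smoothness, whereas #calculate_error uses root mean square, and all names are illustrative:

```ruby
# Fit [slope, offset] to samples of y = 2x + 1 using only numeric
# gradients, mimicking #error_gradient_approximation + #move_with_gradient.
input  = [0.0, 1.0, 2.0, 3.0]
output = input.map { |x| 2.0 * x + 1.0 }

params = [0.0, 0.0] # [slope, offset] to be learned

# mean squared error of a parameter pair (MSE instead of the library's
# RMSE, to keep the sketch smooth near zero error)
error = lambda do |p|
  input.each_with_index.sum { |x, i| (p[0] * x + p[1] - output[i])**2 } / input.size.to_f
end

step, learning_rate = 0.001, 0.05
2000.times do
  params.each_index do |direction|
    moved = params.dup
    moved[direction] += step
    # finite-difference error gradient, as error_gradient_approximation
    gradient = (error.call(moved) - error.call(params)) / step
    # step against the gradient, as move_with_gradient
    params[direction] -= gradient * learning_rate
  end
end

puts params.inspect # close to [2.0, 1.0]
```

This is the fallback path for functions with no analytic gradient; subclasses that know their derivatives can override move_with_gradient with an exact update.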
#number_of_directions ⇒ Object
# File 'lib/newral/functions/base.rb', line 51

def number_of_directions
  raise Errors::NotImplemented
end