Class: DNN::Layers::Layer
- Inherits: Object
- Defined in: lib/dnn/core/layers/basic_layers.rb
Overview
Superclass of all layer classes.
Direct Known Subclasses
Dropout, ELU, Flatten, GlobalAvgPool2D, InputLayer, LeakyReLU, MergeLayer, Mish, Pool2D, ReLU, Reshape, Sigmoid, Softplus, Softsign, Swish, Tanh, TrainableLayer, UnPool2D
Instance Attribute Summary
- #input_shape ⇒ Object (readonly): Returns the value of attribute input_shape.
Class Method Summary
Instance Method Summary
- #backward(dy) ⇒ Object: Backward propagation.
- #build(input_shape) ⇒ Object: Build the layer.
- #built? ⇒ Boolean: Returns true if the layer has already been built.
- #call(input_tensor) ⇒ Tensor: Forward propagation and create a link.
- #clean ⇒ Object
- #forward(x) ⇒ Object: Forward propagation.
- #initialize ⇒ Layer (constructor): A new instance of Layer.
- #load_hash(hash) ⇒ Object
- #output_shape ⇒ Array: Returns the output shape; reimplement as needed.
- #to_hash(merge_hash = nil) ⇒ Object: Convert the layer to a hash.
Constructor Details
#initialize ⇒ Layer
Returns a new instance of Layer.
# File 'lib/dnn/core/layers/basic_layers.rb', line 21

def initialize
  @built = false
end
Instance Attribute Details
#input_shape ⇒ Object (readonly)
Returns the value of attribute input_shape.
# File 'lib/dnn/core/layers/basic_layers.rb', line 6

def input_shape
  @input_shape
end
Class Method Details
.call(x, *args) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 8

def self.call(x, *args)
  new(*args).(x)
end
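The `new(*args).(x)` idiom above constructs an instance and immediately invokes its `#call` through Ruby's `.()` sugar. A minimal stand-alone sketch of this pattern (the `Scale` class and its `factor` parameter are hypothetical, not part of DNN):

```ruby
# Sketch of the class-level call pattern: build an instance from the
# trailing arguments, then apply it to x via .() (which invokes #call).
class Scale
  def initialize(factor)
    @factor = factor
  end

  def call(x)
    x * @factor
  end

  def self.call(x, *args)
    new(*args).(x)
  end
end

puts Scale.call(10, 3) # => 30
puts Scale.new(4).(5)  # => 20
```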
.from_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 12

def self.from_hash(hash)
  return nil unless hash
  layer_class = DNN.const_get(hash[:class])
  layer = layer_class.allocate
  raise DNN_Error, "#{layer.class} is not an instance of #{self} class." unless layer.is_a?(self)
  layer.load_hash(hash)
  layer
end
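The deserialization pattern above can be sketched outside the DNN module: the stored class name is resolved with `const_get`, the class is allocated without running `initialize`, and `load_hash` restores its state. `SerializableLayer` is a stand-in name, not a real DNN class:

```ruby
# Stand-alone sketch of the from_hash round trip (assumed names, not
# the actual gem): resolve the class, allocate it, restore via load_hash.
class SerializableLayer
  def initialize
    @built = false
  end

  def self.from_hash(hash)
    return nil unless hash
    layer_class = Object.const_get(hash[:class])
    layer = layer_class.allocate
    raise TypeError, "#{layer.class} is not a #{self}" unless layer.is_a?(self)
    layer.load_hash(hash)
    layer
  end

  def to_hash
    { class: self.class.name }
  end

  def load_hash(hash)
    initialize
  end
end

restored = SerializableLayer.from_hash(SerializableLayer.new.to_hash)
puts restored.class # => SerializableLayer
```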
Instance Method Details
#backward(dy) ⇒ Object
Backward propagation.
# File 'lib/dnn/core/layers/basic_layers.rb', line 57

def backward(dy)
  raise NotImplementedError, "Class '#{self.class.name}' has implement method 'backward'"
end
#build(input_shape) ⇒ Object
Build the layer.
# File 'lib/dnn/core/layers/basic_layers.rb', line 39

def build(input_shape)
  @input_shape = input_shape
  @built = true
end
#built? ⇒ Boolean
Returns true if the layer has already been built.
# File 'lib/dnn/core/layers/basic_layers.rb', line 45

def built?
  @built
end
#call(input_tensor) ⇒ Tensor
Forward propagation and create a link.
# File 'lib/dnn/core/layers/basic_layers.rb', line 28

def call(input_tensor)
  x = input_tensor.data
  prev_link = input_tensor.link
  build(x.shape[1..-1]) unless built?
  y = forward(x)
  link = Link.new(prev_link, self)
  Tensor.new(y, link)
end
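The flow above (build lazily from the input's per-sample shape on first call, then forward, then wrap the result in a new Tensor linked back through the graph) can be sketched with simplified stand-ins. `Tensor` and `Link` here are plain structs, not the real DNN classes, and `DoubleLayer` is a hypothetical subclass:

```ruby
# Hedged sketch of the call flow: Tensor/Link are simplified stand-ins.
Tensor = Struct.new(:data, :link)
Link = Struct.new(:prev, :layer)

class SketchBase
  attr_reader :input_shape

  def initialize
    @built = false
  end

  def built?
    @built
  end

  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  def call(input_tensor)
    x = input_tensor.data
    # First axis is the batch; infer the per-sample shape from one row.
    build([x.first.length]) unless built?
    y = forward(x)
    Tensor.new(y, Link.new(input_tensor.link, self))
  end
end

# Example subclass: doubles every element of its input batch.
class DoubleLayer < SketchBase
  def forward(x)
    x.map { |row| row.map { |v| v * 2 } }
  end
end

layer = DoubleLayer.new
out = layer.call(Tensor.new([[1, 2], [3, 4]], nil))
puts out.data.inspect          # => [[2, 4], [6, 8]]
puts layer.input_shape.inspect # => [2]
```

Note that the layer builds itself exactly once: subsequent calls skip `build` because `built?` is already true.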
#clean ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 79

def clean
  input_shape = @input_shape
  hash = to_hash
  instance_variables.each do |ivar|
    instance_variable_set(ivar, nil)
  end
  load_hash(hash)
  build(input_shape)
end
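The reset-and-rebuild idea behind `#clean` is: remember the input shape, serialize the configuration, nil every instance variable (discarding cached state), then restore the configuration and rebuild. A self-contained sketch with stand-in names (`ResettableLayer` and `@scratch` are illustrative, not part of DNN):

```ruby
# Sketch of #clean: cached state (@scratch) is wiped, while the
# configuration and input shape survive the reset.
class ResettableLayer
  attr_reader :input_shape, :scratch

  def initialize
    @built = false
  end

  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  def forward(x)
    @scratch = x # cached state that clean should discard
    x
  end

  def to_hash
    { class: self.class.name }
  end

  def load_hash(hash)
    initialize
  end

  def clean
    input_shape = @input_shape
    hash = to_hash
    instance_variables.each { |ivar| instance_variable_set(ivar, nil) }
    load_hash(hash)
    build(input_shape)
  end
end

layer = ResettableLayer.new
layer.build([3])
layer.forward([1, 2, 3])
layer.clean
puts layer.scratch.inspect      # => nil
puts layer.input_shape.inspect  # => [3]
```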
#forward(x) ⇒ Object
Forward propagation.
# File 'lib/dnn/core/layers/basic_layers.rb', line 51

def forward(x)
  raise NotImplementedError, "Class '#{self.class.name}' has implement method 'forward'"
end
#load_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 75

def load_hash(hash)
  initialize
end
#output_shape ⇒ Array
Please reimplement this method as needed. The default implementation returns input_shape.
# File 'lib/dnn/core/layers/basic_layers.rb', line 64

def output_shape
  @input_shape
end
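A subclass overrides `output_shape` when its output dimensions differ from its input. Flatten is a real subclass listed above, but the body here is an illustrative stand-in, not the gem's implementation:

```ruby
# Sketch: a Flatten-style subclass collapses the input shape into a
# single dimension, so it must override output_shape.
class ShapedLayer
  attr_reader :input_shape

  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  # Default: output shape equals input shape.
  def output_shape
    @input_shape
  end
end

class FlattenLike < ShapedLayer
  def output_shape
    [@input_shape.inject(:*)]
  end
end

f = FlattenLike.new
f.build([4, 4, 3])
puts f.output_shape.inspect # => [48]
```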
#to_hash(merge_hash = nil) ⇒ Object
Convert the layer to a hash.
# File 'lib/dnn/core/layers/basic_layers.rb', line 69

def to_hash(merge_hash = nil)
  hash = { class: self.class.name }
  hash.merge!(merge_hash) if merge_hash
  hash
end
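The `merge_hash` parameter lets a subclass pass its own parameters up through `super`, so the base class contributes the `:class` key. A hedged sketch; `DenseLike` and `num_nodes` are hypothetical names for illustration:

```ruby
# Sketch of the to_hash/merge_hash pattern: the subclass serializes
# its own parameters, the base class adds the class name.
class SketchLayer
  def to_hash(merge_hash = nil)
    hash = { class: self.class.name }
    hash.merge!(merge_hash) if merge_hash
    hash
  end
end

class DenseLike < SketchLayer
  def initialize(num_nodes)
    @num_nodes = num_nodes
  end

  def to_hash
    super(num_nodes: @num_nodes)
  end
end

puts DenseLike.new(64).to_hash.inspect # prints the class name and num_nodes
```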