Class: DNN::Layers::Layer
- Inherits: Object
  - Object
  - DNN::Layers::Layer
- Defined in: lib/dnn/core/layers.rb
Overview
Superclass of all layer classes.
Direct Known Subclasses
Dropout, ELU, Flatten, HasParamLayer, InputLayer, LeakyReLU, Pool2D, ReLU, Reshape, Sigmoid, Softplus, Softsign, Swish, Tanh, UnPool2D, MergeLayers::MergeLayer
Instance Attribute Summary
- #input_shape ⇒ Object (readonly)
  Returns the value of attribute input_shape.
- #name ⇒ Object
  Returns the value of attribute name.
Class Method Summary
- .call(x, *args) ⇒ Object
- .from_hash(hash) ⇒ Object
Instance Method Summary
- #backward(dy) ⇒ Object
  Backward propagation.
- #build(input_shape) ⇒ Object
  Build the layer.
- #built? ⇒ Boolean
  Returns true if the layer has already been built.
- #call(input_tensor) ⇒ Tensor
  Forward propagation and create a link.
- #forward(x) ⇒ Object
  Forward propagation.
- #initialize ⇒ Layer (constructor)
  A new instance of Layer.
- #load_hash(hash) ⇒ Object
- #output_shape ⇒ Array
  Reimplement this method as needed.
- #to_hash(merge_hash = nil) ⇒ Object
  Serializes the layer to a hash.
Constructor Details
#initialize ⇒ Layer
Returns a new instance of Layer.
# File 'lib/dnn/core/layers.rb', line 23

def initialize
  @built = false
  @name = nil
end
Instance Attribute Details
#input_shape ⇒ Object (readonly)
Returns the value of attribute input_shape.
# File 'lib/dnn/core/layers.rb', line 7

def input_shape
  @input_shape
end
#name ⇒ Object
Returns the value of attribute name.
# File 'lib/dnn/core/layers.rb', line 6

def name
  @name
end
Class Method Details
.call(x, *args) ⇒ Object
# File 'lib/dnn/core/layers.rb', line 9

def self.call(x, *args)
  new(*args).(x)
end
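This enables a functional shorthand: LayerClass.(x) instantiates the layer and applies it in one step. A minimal usage sketch, assuming ReLU from the subclass list above, that Tensor is exposed as DNN::Tensor, and that Tensor.new(data, link) is the constructor used in #call below:

require "dnn"

# Hypothetical one-shot invocation; a nil link marks an input tensor.
x = DNN::Tensor.new(Numo::SFloat.new(2, 3).rand, nil)
y = DNN::Layers::ReLU.(x)   # same as DNN::Layers::ReLU.new.(x)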
.from_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers.rb', line 13

def self.from_hash(hash)
  return nil unless hash
  layer_class = DNN.const_get(hash[:class])
  layer = layer_class.allocate
  raise DNN_Error, "#{layer.class} is not an instance of #{self} class." unless layer.is_a?(self)
  layer.load_hash(hash)
  layer.name = hash[:name]&.to_sym
  layer
end
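A round-trip sketch of deserialization, again assuming DNN::Layers::ReLU from the subclass list (#to_hash is documented below):

# Hypothetical round trip: serialize a layer, then rebuild it from the hash.
layer = DNN::Layers::ReLU.new
hash  = layer.to_hash                       # => { class: "DNN::Layers::ReLU", name: nil }
copy  = DNN::Layers::Layer.from_hash(hash)
copy.class                                  # => DNN::Layers::ReLU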
Instance Method Details
#backward(dy) ⇒ Object
Backward propagation.
# File 'lib/dnn/core/layers.rb', line 60

def backward(dy)
  raise NotImplementedError, "Class '#{self.class.name}' must implement method 'backward'"
end
#build(input_shape) ⇒ Object
Build the layer.
# File 'lib/dnn/core/layers.rb', line 42

def build(input_shape)
  @input_shape = input_shape
  @built = true
end
#built? ⇒ Boolean
Returns true if the layer has already been built.

# File 'lib/dnn/core/layers.rb', line 48

def built?
  @built
end
#call(input_tensor) ⇒ Tensor
Forward propagation and create a link.
# File 'lib/dnn/core/layers.rb', line 31

def call(input_tensor)
  x = input_tensor.data
  prev_link = input_tensor.link
  build(x.shape[1..-1]) unless built?
  y = forward(x)
  link = Link.new(prev_link, self)
  Tensor.new(y, link)
end
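Note that #build receives x.shape[1..-1], i.e. the input shape with the leading batch dimension dropped, and runs only on the first call. A sketch of this lazy-build behavior, under the same ReLU/Tensor assumptions as above:

layer = DNN::Layers::ReLU.new
layer.built?        # => false
x = DNN::Tensor.new(Numo::SFloat.new(8, 28, 28).rand, nil)
layer.(x)           # first call builds the layer with the batch dim dropped
layer.built?        # => true
layer.input_shape   # => [28, 28]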
#forward(x) ⇒ Object
Forward propagation.
# File 'lib/dnn/core/layers.rb', line 54

def forward(x)
  raise NotImplementedError, "Class '#{self.class.name}' must implement method 'forward'"
end
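Since #forward and #backward only raise NotImplementedError, every concrete layer must supply both. A minimal hypothetical subclass (the name Double and its behavior are illustrative, not part of the library):

module DNN
  module Layers
    # Hypothetical element-wise layer: doubles its input.
    class Double < Layer
      def forward(x)
        x * 2
      end

      def backward(dy)
        dy * 2   # d(2x)/dx = 2, so the incoming gradient is scaled by 2
      end
    end
  end
end

Because the output shape equals the input shape, the default #output_shape below already works for this layer.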
#load_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers.rb', line 78

def load_hash(hash)
  initialize
end
#output_shape ⇒ Array
Reimplement this method as needed. The default implementation returns input_shape.

# File 'lib/dnn/core/layers.rb', line 67

def output_shape
  @input_shape
end
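Layers that change the shape of their input must override this so downstream layers build with the correct shape. A hypothetical flatten-style sketch (MyFlatten is illustrative; the library's own Flatten may differ):

module DNN
  module Layers
    # Hypothetical layer that collapses all feature dimensions into one.
    class MyFlatten < Layer
      def forward(x)
        @x_shape = x.shape
        x.reshape(x.shape[0], x.shape[1..-1].reduce(:*))
      end

      def backward(dy)
        dy.reshape(*@x_shape)
      end

      def output_shape
        [@input_shape.reduce(:*)]   # e.g. [28, 28] -> [784]
      end
    end
  end
end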
#to_hash(merge_hash = nil) ⇒ Object
Serializes the layer to a hash.

# File 'lib/dnn/core/layers.rb', line 72

def to_hash(merge_hash = nil)
  hash = { class: self.class.name, name: @name }
  hash.merge!(merge_hash) if merge_hash
  hash
end
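Subclasses typically pass their own fields through merge_hash so that class and name are always included, and restore them in #load_hash. A hypothetical sketch (MyScale and its factor parameter are illustrative):

module DNN
  module Layers
    # Hypothetical layer with one extra hyperparameter serialized via merge_hash.
    class MyScale < Layer
      def initialize(factor = 1.0)
        super()
        @factor = factor
      end

      def forward(x)
        x * @factor
      end

      def backward(dy)
        dy * @factor
      end

      def to_hash
        super(factor: @factor)      # merged into { class: ..., name: ... }
      end

      def load_hash(hash)
        initialize(hash[:factor])   # mirrors the default, which calls initialize
      end
    end
  end
end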