Class: DNN::Layers::Layer
Inherits: Object
Defined in: lib/dnn/core/layers.rb
Overview
Super class of all layer classes.
Direct Known Subclasses
Activations::ELU, Activations::LeakyReLU, Activations::ReLU, Activations::Sigmoid, Activations::Softplus, Activations::Softsign, Activations::Swish, Activations::Tanh, Dropout, Flatten, HasParamLayer, InputLayer, Pool2D, Reshape, UnPool2D, MergeLayers::MergeLayer
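The methods below define the contract that concrete layers satisfy: forward and backward must be overridden, while output_shape, to_hash, and load_hash are overridden when a layer changes its input shape or carries extra configuration. As a minimal sketch of that contract (the Scale class and its factor option are hypothetical, not part of the library):

    require "dnn"  # assumes the ruby-dnn gem is installed

    module DNN
      module Layers
        # Hypothetical layer that multiplies its input by a constant factor.
        class Scale < Layer
          def initialize(factor = 2.0)
            super()
            @factor = factor
          end

          # Forward propagation: y = factor * x.
          def forward(x)
            x * @factor
          end

          # Backward propagation: dL/dx = factor * dL/dy.
          def backward(dy)
            dy * @factor
          end

          # Serialize the extra configuration alongside the base keys.
          def to_hash
            super(factor: @factor)
          end

          # Restore the configuration written by to_hash.
          def load_hash(hash)
            initialize(hash[:factor])
          end
        end
      end
    end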
Instance Attribute Summary
- #input_shape ⇒ Object (readonly)
  Returns the value of attribute input_shape.
- #name ⇒ Object
  Returns the value of attribute name.
Class Method Summary
- .call(x, *args) ⇒ Object
- .from_hash(hash) ⇒ Object
Instance Method Summary
- #backward(dy) ⇒ Object
  Backward propagation.
- #build(input_shape) ⇒ Object
  Build the layer.
- #built? ⇒ Boolean
  Returns true if the layer has already been built.
- #call(input) ⇒ Object
  Forward propagation and link creation.
- #forward(x) ⇒ Object
  Forward propagation.
- #initialize ⇒ Layer (constructor)
  A new instance of Layer.
- #load_hash(hash) ⇒ Object
- #output_shape ⇒ Array
  Override this method as needed; the default implementation returns input_shape.
- #to_hash(merge_hash = nil) ⇒ Object
  Serializes the layer to a hash.
Constructor Details
#initialize ⇒ Layer
Returns a new instance of Layer.
    # File 'lib/dnn/core/layers.rb', line 22
    def initialize
      @built = false
      @name = nil
    end
Instance Attribute Details
#input_shape ⇒ Object (readonly)
Returns the value of attribute input_shape.
    # File 'lib/dnn/core/layers.rb', line 7
    def input_shape
      @input_shape
    end
#name ⇒ Object
Returns the value of attribute name.
    # File 'lib/dnn/core/layers.rb', line 6
    def name
      @name
    end
Class Method Details
.call(x, *args) ⇒ Object
    # File 'lib/dnn/core/layers.rb', line 9
    def self.call(x, *args)
      self.new(*args).(x)
    end
.from_hash(hash) ⇒ Object
    # File 'lib/dnn/core/layers.rb', line 13
    def self.from_hash(hash)
      return nil unless hash
      layer_class = DNN.const_get(hash[:class])
      layer = layer_class.allocate
      raise DNN_Error.new("#{layer.class} is not an instance of #{self} class.") unless layer.is_a?(self)
      layer.load_hash(hash)
      layer
    end
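Combined with #to_hash, .from_hash gives a simple serialization round trip. The sketch below reuses the hypothetical Scale layer from the overview and assumes its fully qualified class name resolves through DNN.const_get:

    layer = DNN::Layers::Scale.new(3.0)
    hash  = layer.to_hash
    # hash is roughly { class: "DNN::Layers::Scale", name: nil, factor: 3.0 }

    restored = DNN::Layers::Layer.from_hash(hash)
    restored.class    # => DNN::Layers::Scale
    restored.to_hash  # => same hash as above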
Instance Method Details
#backward(dy) ⇒ Object
Backward propagation.
    # File 'lib/dnn/core/layers.rb', line 57
    def backward(dy)
      raise NotImplementedError.new("Class '#{self.class.name}' has implement method 'backward'")
    end
#build(input_shape) ⇒ Object
Build the layer.
    # File 'lib/dnn/core/layers.rb', line 39
    def build(input_shape)
      @input_shape = input_shape
      @built = true
    end
#built? ⇒ Boolean
Returns true if the layer has already been built.
    # File 'lib/dnn/core/layers.rb', line 45
    def built?
      @built
    end
#call(input) ⇒ Object
Forward propagation and link creation.
    # File 'lib/dnn/core/layers.rb', line 29
    def call(input)
      x, prev_link = *input
      build(x.shape[1..-1]) unless built?
      y = forward(x)
      link = Link.new(prev_link, self)
      [y, link]
    end
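A usage sketch (the Scale layer is the hypothetical one from the overview; ruby-dnn represents tensors as Numo arrays): the input is an [x, prev_link] pair, the layer builds itself from the per-sample shape on first use, and the result is the output paired with a new Link.

    require "numo/narray"  # ruby-dnn uses Numo arrays for its tensors

    x = Numo::SFloat.new(2, 4).rand      # batch of 2 samples, 4 features each
    layer = DNN::Layers::Scale.new(3.0)  # hypothetical layer from the overview

    y, link = layer.call([x, nil])       # nil: there is no previous link
    layer.built?       # => true
    layer.input_shape  # => [4] (the batch dimension is stripped)
    y.shape            # => [2, 4]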
#forward(x) ⇒ Object
Forward propagation.
    # File 'lib/dnn/core/layers.rb', line 51
    def forward(x)
      raise NotImplementedError.new("Class '#{self.class.name}' has implement method 'forward'")
    end
#load_hash(hash) ⇒ Object
    # File 'lib/dnn/core/layers.rb', line 75
    def load_hash(hash)
      initialize
    end
#output_shape ⇒ Array
Override this method as needed. The default implementation returns input_shape.
    # File 'lib/dnn/core/layers.rb', line 64
    def output_shape
      @input_shape
    end
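A layer that changes the shape of its input should override output_shape so that downstream layers build against the correct shape. A hypothetical sketch in the spirit of the built-in Flatten layer:

    # Hypothetical layer that flattens each sample to one dimension.
    class MyFlatten < DNN::Layers::Layer
      def forward(x)
        x.reshape(x.shape[0], *output_shape)
      end

      def backward(dy)
        dy.reshape(dy.shape[0], *@input_shape)
      end

      # e.g. an input_shape of [28, 28] becomes [784].
      def output_shape
        [@input_shape.reduce(:*)]
      end
    end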
#to_hash(merge_hash = nil) ⇒ Object
Serializes the layer to a hash.
    # File 'lib/dnn/core/layers.rb', line 69
    def to_hash(merge_hash = nil)
      hash = { class: self.class.name, name: @name }
      hash.merge!(merge_hash) if merge_hash
      hash
    end