Class: DNN::Layers::Layer
Inherits: Object
Defined in: lib/dnn/core/layers/basic_layers.rb
Overview
Superclass of all layer classes.
Direct Known Subclasses
Dropout, ELU, Exp, Flatten, GRUDense, GlobalAvgPool2D, InputLayer, LSTMDense, Lasso, LeakyReLU, Log, Mean, MergeLayer, Mish, Pool2D, Pow, ReLU, Reshape, Ridge, Sigmoid, SimpleRNNDense, Softplus, Softsign, Sqrt, Sum, Swish, Tanh, TrainableLayer, UnPool2D
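The usual pattern for a new layer is to subclass Layer, implement forward, and override output_shape when the output shape differs from the input. A minimal, self-contained sketch of that pattern, runnable without the ruby-dnn gem (the Scale layer, its factor parameter, and the Tensor struct here are illustrative stand-ins, not part of the library):

```ruby
# Stand-in for DNN::Tensor so the sketch runs without the gem.
Tensor = Struct.new(:data)

class Layer
  attr_reader :input_shape

  def initialize
    @built = false
  end

  # Build the layer: record the input shape and mark it built.
  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  def built?
    @built
  end

  # Default: the output shape equals the input shape.
  def output_shape
    @input_shape
  end
end

# Hypothetical subclass: multiplies every element by a constant factor.
class Scale < Layer
  def initialize(factor)
    super()
    @factor = factor
  end

  def forward(input_tensor)
    Tensor.new(input_tensor.data.map { |v| v * @factor })
  end
end

layer = Scale.new(2)
layer.build([3])
out = layer.forward(Tensor.new([1, 2, 3]))
out.data  # => [2, 4, 6]
```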
Instance Attribute Summary collapse
-
#input_shape ⇒ Object
readonly
Returns the value of attribute input_shape.
Class Method Summary collapse
Instance Method Summary collapse
-
#build(input_shape) ⇒ Object
Build the layer.
-
#built? ⇒ Boolean
Returns true if the layer has already been built.
-
#call(input_tensor) ⇒ Tensor
Performs forward propagation and creates a link.
-
#clean ⇒ Object
Clean the layer state.
-
#forward(input_tensor) ⇒ Tensor
Forward propagation.
-
#initialize ⇒ Layer
constructor
A new instance of Layer.
- #load_hash(hash) ⇒ Object
-
#output_shape ⇒ Array
Override this method in subclasses as needed.
-
#to_hash(merge_hash = nil) ⇒ Object
Serializes the layer to a hash.
Constructor Details
#initialize ⇒ Layer
Returns a new instance of Layer.
# File 'lib/dnn/core/layers/basic_layers.rb', line 43

def initialize
  @built = false
end
Instance Attribute Details
#input_shape ⇒ Object (readonly)
Returns the value of attribute input_shape.
# File 'lib/dnn/core/layers/basic_layers.rb', line 28

def input_shape
  @input_shape
end
Class Method Details
.call(x, *args) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 30

def self.call(x, *args)
  new(*args).(x)
end
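.call(x, *args) is shorthand: it instantiates the layer with args and immediately invokes it on x, using Ruby's `obj.(x)` syntax (sugar for `obj.call(x)`). A tiny standalone illustration of the same pattern (the Doubler class is hypothetical):

```ruby
class Doubler
  # Mirror of the documented pattern: instantiate, then invoke on x.
  def self.call(x, *args)
    new(*args).(x)
  end

  def call(x)
    x * 2
  end
end

Doubler.(21)  # => 42
```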
.from_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 34

def self.from_hash(hash)
  return nil unless hash
  layer_class = DNN.const_get(hash[:class])
  layer = layer_class.allocate
  raise DNN_Error, "#{layer.class} is not an instance of #{self} class." unless layer.is_a?(self)
  layer.load_hash(hash)
  layer
end
Instance Method Details
#build(input_shape) ⇒ Object
Build the layer.
# File 'lib/dnn/core/layers/basic_layers.rb', line 58

def build(input_shape)
  @input_shape = input_shape
  @built = true
end
#built? ⇒ Boolean
Returns true if the layer has already been built.
# File 'lib/dnn/core/layers/basic_layers.rb', line 64

def built?
  @built
end
#call(input_tensor) ⇒ Tensor
Performs forward propagation and creates a link.
# File 'lib/dnn/core/layers/basic_layers.rb', line 50

def call(input_tensor)
  input_tensor = Tensor.new(input_tensor) if !input_tensor.is_a?(Tensor) && !input_tensor.is_a?(Param)
  build(input_tensor.data.shape[1..-1]) unless built?
  forward(input_tensor)
end
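Note that #call builds the layer lazily on first use: input_shape is derived from the incoming data's shape with the leading batch axis dropped (`shape[1..-1]`). A sketch of that behavior using stand-in types (FakeTensor, FakeData, and LazyLayer are illustrative, not ruby-dnn names):

```ruby
FakeTensor = Struct.new(:data)
FakeData   = Struct.new(:shape)

class LazyLayer
  attr_reader :input_shape

  def initialize
    @built = false
  end

  def built?
    @built
  end

  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  def call(input_tensor)
    # shape[1..-1] drops the leading batch axis, as in the documented source.
    build(input_tensor.data.shape[1..-1]) unless built?
    input_tensor # a real layer would return forward(input_tensor)
  end
end

layer = LazyLayer.new
layer.(FakeTensor.new(FakeData.new([32, 28, 28, 3]))) # batch of 32 28x28 RGB images
layer.input_shape  # => [28, 28, 3]
```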
#clean ⇒ Object
Clean the layer state.
# File 'lib/dnn/core/layers/basic_layers.rb', line 94

def clean
  input_shape = @input_shape
  hash = to_hash
  instance_variables.each do |ivar|
    instance_variable_set(ivar, nil)
  end
  load_hash(hash)
  build(input_shape)
end
#forward(input_tensor) ⇒ Tensor
Forward propagation.
# File 'lib/dnn/core/layers/basic_layers.rb', line 71

def forward(input_tensor)
  raise NotImplementedError, "Class '#{self.class.name}' must implement method 'forward'"
end
#load_hash(hash) ⇒ Object
# File 'lib/dnn/core/layers/basic_layers.rb', line 89

def load_hash(hash)
  initialize
end
#output_shape ⇒ Array
Override this method in subclasses as needed. The default implementation returns input_shape.
# File 'lib/dnn/core/layers/basic_layers.rb', line 78

def output_shape
  @input_shape
end
#to_hash(merge_hash = nil) ⇒ Object
Serializes the layer to a hash.
# File 'lib/dnn/core/layers/basic_layers.rb', line 83

def to_hash(merge_hash = nil)
  hash = { class: self.class.name }
  hash.merge!(merge_hash) if merge_hash
  hash
end