Class: DNN::Layers::Layer

Inherits:
Object
Defined in:
lib/dnn/core/layers.rb

Overview

Superclass of all layer classes.

Instance Attribute Summary collapse

Class Method Summary collapse

Instance Method Summary collapse

Constructor Details

#initialize ⇒ Layer

Returns a new instance of Layer.



# File 'lib/dnn/core/layers.rb', line 23

def initialize
  @built = false
  @name = nil
end

Instance Attribute Details

#input_shape ⇒ Object (readonly)

Returns the value of attribute input_shape.



# File 'lib/dnn/core/layers.rb', line 7

def input_shape
  @input_shape
end

#name ⇒ Object

Returns the value of attribute name.



# File 'lib/dnn/core/layers.rb', line 6

def name
  @name
end

Class Method Details

.call(x, *args) ⇒ Object



# File 'lib/dnn/core/layers.rb', line 9

def self.call(x, *args)
  new(*args).(x)
end
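The class-level `call` is a shorthand: it constructs a layer from `*args` and immediately applies it to `x` via Ruby's `.()` syntax (which invokes the instance's `#call`). A minimal standalone sketch of this pattern, using a hypothetical `Scale` layer rather than the real library classes:

```ruby
# Hypothetical Scale "layer" that multiplies its input by a factor
# (standalone sketch of the Layer.call pattern, not the real ruby-dnn class).
class Scale
  def initialize(factor)
    @factor = factor
  end

  # Instance-level #call makes the object usable with the .() shorthand.
  def call(x)
    x * @factor
  end

  # Class-level shortcut: build an instance from *args, then apply it to x.
  def self.call(x, *args)
    new(*args).(x)
  end
end

Scale.call(10, 3)  # => 30, equivalent to Scale.new(3).(10)
```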

.from_hash(hash) ⇒ Object

Raises:

  • (DNN_Error)


# File 'lib/dnn/core/layers.rb', line 13

def self.from_hash(hash)
  return nil unless hash
  layer_class = DNN.const_get(hash[:class])
  layer = layer_class.allocate
  raise DNN_Error, "#{layer.class} is not an instance of #{self} class." unless layer.is_a?(self)
  layer.load_hash(hash)
  layer.name = hash[:name]&.to_sym
  layer
end
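`from_hash` resolves the class by name, `allocate`s an instance without calling `initialize`, and lets `load_hash` restore the state. The round trip with `to_hash` can be sketched standalone as follows; the `Toy` module and its `Dense` layer are hypothetical stand-ins, not the real ruby-dnn classes:

```ruby
# Standalone sketch of the to_hash/from_hash round trip (hypothetical Toy
# module; simplified error class instead of DNN_Error).
module Toy
  class Layer
    attr_accessor :name

    def to_hash(merge_hash = nil)
      hash = { class: self.class.name, name: @name }
      hash.merge!(merge_hash) if merge_hash
      hash
    end

    def load_hash(hash)
      initialize
    end

    def self.from_hash(hash)
      return nil unless hash
      # Resolve the class by its serialized name, allocate without calling
      # initialize, then let load_hash rebuild the state.
      layer_class = Object.const_get(hash[:class])
      layer = layer_class.allocate
      raise TypeError, "#{layer.class} is not a #{self}" unless layer.is_a?(self)
      layer.load_hash(hash)
      layer.name = hash[:name]&.to_sym
      layer
    end
  end

  class Dense < Layer
    attr_reader :units

    def initialize(units = 1)
      @units = units
    end

    # Subclasses merge their own state into the base hash.
    def to_hash
      super(units: @units)
    end

    def load_hash(hash)
      initialize(hash[:units])
    end
  end
end

restored = Toy::Layer.from_hash(Toy::Dense.new(8).to_hash)
restored.units  # => 8
```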

Instance Method Details

#backward(dy) ⇒ Object

Backward propagation.

Parameters:

  • dy (Numo::SFloat)

    Differential value of the output data.

Raises:

  • (NotImplementedError)


# File 'lib/dnn/core/layers.rb', line 60

def backward(dy)
  raise NotImplementedError, "Class '#{self.class.name}' has not implemented method 'backward'"
end

#build(input_shape) ⇒ Object

Build the layer.

Parameters:

  • input_shape (Array)

    The shape of the input data.



# File 'lib/dnn/core/layers.rb', line 42

def build(input_shape)
  @input_shape = input_shape
  @built = true
end

#built? ⇒ Boolean

Returns true if the layer has already been built.

Returns:

  • (Boolean)

    True if the layer has already been built.



# File 'lib/dnn/core/layers.rb', line 48

def built?
  @built
end

#call(input_tensor) ⇒ Tensor

Performs forward propagation and creates a link.

Parameters:

  • input_tensor (Tensor)

    Input tensor.

Returns:



# File 'lib/dnn/core/layers.rb', line 31

def call(input_tensor)
  x = input_tensor.data
  prev_link = input_tensor.link
  build(x.shape[1..-1]) unless built?
  y = forward(x)
  link = Link.new(prev_link, self)
  Tensor.new(y, link)
end
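Note the lazy build: on the first call, the layer infers its `input_shape` from `x.shape[1..-1]`, i.e. all axes after the batch axis. A standalone sketch of that flow, where `IdentityLayer`, `Batch`, and the simplified `Tensor`/`Link` structs are hypothetical stand-ins for the real classes:

```ruby
# Simplified stand-ins for ruby-dnn's Tensor, Link, and Numo::SFloat.
Tensor = Struct.new(:data, :link)
Link = Struct.new(:prev, :layer)
Batch = Struct.new(:shape)  # only the shape matters for this sketch

# Hypothetical pass-through layer showing the build-on-first-call flow.
class IdentityLayer
  attr_reader :input_shape

  def initialize
    @built = false
  end

  def built?
    @built
  end

  def build(input_shape)
    @input_shape = input_shape
    @built = true
  end

  def forward(x)
    x
  end

  def call(input_tensor)
    x = input_tensor.data
    # The first axis is the batch size; input_shape is inferred from the
    # remaining axes the first time data flows through.
    build(x.shape[1..-1]) unless built?
    Tensor.new(forward(x), Link.new(input_tensor.link, self))
  end
end

layer = IdentityLayer.new
layer.call(Tensor.new(Batch.new([32, 28, 28]), nil))
layer.input_shape  # => [28, 28] (batch axis of 32 dropped)
```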

#forward(x) ⇒ Object

Forward propagation.

Parameters:

  • x (Numo::SFloat)

    Input data.

Raises:

  • (NotImplementedError)


# File 'lib/dnn/core/layers.rb', line 54

def forward(x)
  raise NotImplementedError, "Class '#{self.class.name}' has not implemented method 'forward'"
end
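Concrete layers override `forward` and `backward` as a matched pair: `backward` receives the gradient of the output and must return the gradient of the input. A hypothetical ReLU sketched against that contract, with plain Arrays standing in for `Numo::SFloat`:

```ruby
# Hypothetical ReLU implementing the forward/backward contract
# (standalone sketch; plain Arrays stand in for Numo::SFloat).
class ReLU
  # Forward: clamp negatives to zero, remembering the input for backward.
  def forward(x)
    @x = x
    x.map { |v| v.negative? ? 0.0 : v }
  end

  # Backward: pass the gradient through only where the input was positive.
  def backward(dy)
    dy.each_with_index.map { |d, i| @x[i] > 0 ? d : 0.0 }
  end
end

relu = ReLU.new
relu.forward([-1.0, 2.0])   # => [0.0, 2.0]
relu.backward([1.0, 1.0])   # => [0.0, 1.0]
```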

#load_hash(hash) ⇒ Object



# File 'lib/dnn/core/layers.rb', line 78

def load_hash(hash)
  initialize
end

#output_shape ⇒ Array

Please reimplement this method as needed. The default implementation returns input_shape.

Returns:

  • (Array)

    Return the shape of the output data.



# File 'lib/dnn/core/layers.rb', line 67

def output_shape
  @input_shape
end
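A layer that changes the shape of its data overrides this default. For instance, a hypothetical Flatten-style layer (a standalone sketch, not the library's actual implementation) would collapse all input axes into one:

```ruby
# Hypothetical Flatten-style override of output_shape (standalone sketch).
class FlattenSketch
  def initialize(input_shape)
    @input_shape = input_shape
  end

  # Collapse all input axes into a single axis.
  def output_shape
    [@input_shape.inject(:*)]
  end
end

FlattenSketch.new([28, 28]).output_shape  # => [784]
```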

#to_hash(merge_hash = nil) ⇒ Object

Serializes the layer to a hash.



# File 'lib/dnn/core/layers.rb', line 72

def to_hash(merge_hash = nil)
  hash = { class: self.class.name, name: @name }
  hash.merge!(merge_hash) if merge_hash
  hash
end