Class: DNN::Layers::Connection
- Inherits: HasParamLayer
- Object
- Layer
- HasParamLayer
- DNN::Layers::Connection
- Defined in:
- lib/dnn/core/layers.rb
Overview
The superclass of all connection layers.
Instance Attribute Summary collapse
- #l1_lambda ⇒ Object (readonly): L1 regularization.
- #l2_lambda ⇒ Object (readonly): L2 regularization.
Attributes inherited from HasParamLayer
Instance Method Summary collapse
- #dlasso ⇒ Object
- #dridge ⇒ Object
- #initialize(weight_initializer: Initializers::RandomNormal.new, bias_initializer: Initializers::Zeros.new, l1_lambda: 0, l2_lambda: 0) ⇒ Connection (constructor): Returns a new instance of Connection.
- #lasso ⇒ Object
- #ridge ⇒ Object
- #to_hash(merge_hash) ⇒ Object
Methods inherited from HasParamLayer
Methods inherited from Layer
#backward, #build, #built?, #forward, #prev_layer, #shape
Constructor Details
#initialize(weight_initializer: Initializers::RandomNormal.new, bias_initializer: Initializers::Zeros.new, l1_lambda: 0, l2_lambda: 0) ⇒ Connection
Returns a new instance of Connection.
# File 'lib/dnn/core/layers.rb', line 117

def initialize(weight_initializer: Initializers::RandomNormal.new,
               bias_initializer: Initializers::Zeros.new,
               l1_lambda: 0,
               l2_lambda: 0)
  super()
  @weight_initializer = weight_initializer
  @bias_initializer = bias_initializer
  @l1_lambda = l1_lambda
  @l2_lambda = l2_lambda
  @params[:weight] = @weight = LearningParam.new
  @params[:bias] = @bias = LearningParam.new
end
Instance Attribute Details
#l1_lambda ⇒ Object (readonly)
L1 regularization.

# File 'lib/dnn/core/layers.rb', line 114

def l1_lambda
  @l1_lambda
end
#l2_lambda ⇒ Object (readonly)
L2 regularization.

# File 'lib/dnn/core/layers.rb', line 115

def l2_lambda
  @l2_lambda
end
Instance Method Details
#dlasso ⇒ Object
# File 'lib/dnn/core/layers.rb', line 146

def dlasso
  if @l1_lambda > 0
    dlasso = Xumo::SFloat.ones(*@weight.data.shape)
    dlasso[@weight.data < 0] = -1
    @weight.grad += @l1_lambda * dlasso
  end
end
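A minimal pure-Ruby sketch of the subgradient that #dlasso applies, using a flat array in place of Xumo::SFloat (`dlasso_sketch` is a hypothetical helper, not part of the library). Note that, as in the source, weights equal to zero receive the subgradient +1, since only strictly negative weights are flipped to -1:

```ruby
# Hypothetical sketch of the L1 subgradient used by #dlasso.
# Each weight contributes l1_lambda * sign(w), with sign(0) taken as +1,
# matching the `@weight.data < 0` mask in the source.
def dlasso_sketch(weights, l1_lambda)
  weights.map { |w| w < 0 ? -l1_lambda : l1_lambda }
end

p dlasso_sketch([0.5, -2.0, 0.0], 0.1) # => [0.1, -0.1, 0.1]
```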
#dridge ⇒ Object
# File 'lib/dnn/core/layers.rb', line 154

def dridge
  if @l2_lambda > 0
    @weight.grad += @l2_lambda * @weight.data
  end
end
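A pure-Ruby sketch of the gradient term #dridge adds, on a flat array instead of Xumo::SFloat (`dridge_sketch` is a hypothetical helper for illustration only):

```ruby
# Hypothetical sketch of the L2 gradient term added by #dridge:
# the derivative of 0.5 * l2_lambda * w**2 with respect to w is l2_lambda * w.
def dridge_sketch(weights, l2_lambda)
  weights.map { |w| l2_lambda * w }
end

p dridge_sketch([1.0, -2.0], 0.5) # => [0.5, -1.0]
```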
#lasso ⇒ Object
# File 'lib/dnn/core/layers.rb', line 130

def lasso
  if @l1_lambda > 0
    @l1_lambda * @weight.data.abs.sum
  else
    0
  end
end
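The penalty #lasso evaluates is the l1_lambda-scaled sum of absolute weight values. A pure-Ruby sketch of the same formula, with a flat array standing in for Xumo::SFloat (`lasso_sketch` is a hypothetical helper, not part of the library):

```ruby
# Hypothetical sketch of the L1 penalty computed by #lasso:
# l1_lambda times the sum of absolute weight values, or 0 when disabled.
def lasso_sketch(weights, l1_lambda)
  return 0 unless l1_lambda > 0
  l1_lambda * weights.sum { |w| w.abs }
end

p lasso_sketch([1.0, -2.0, 3.0], 0.5) # => 3.0
```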
#ridge ⇒ Object
# File 'lib/dnn/core/layers.rb', line 138

def ridge
  if @l2_lambda > 0
    0.5 * @l2_lambda * (@weight.data**2).sum
  else
    0
  end
end
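The same formula sketched in pure Ruby on a flat array (`ridge_sketch` is a hypothetical helper for illustration). The 0.5 factor is what makes the gradient in #dridge come out as exactly l2_lambda * w:

```ruby
# Hypothetical sketch of the L2 penalty computed by #ridge:
# half of l2_lambda times the sum of squared weights, or 0 when disabled.
def ridge_sketch(weights, l2_lambda)
  return 0 unless l2_lambda > 0
  0.5 * l2_lambda * weights.sum { |w| w * w }
end

p ridge_sketch([1.0, -2.0, 3.0], 1.0) # => 7.0
```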
#to_hash(merge_hash) ⇒ Object
# File 'lib/dnn/core/layers.rb', line 160

def to_hash(merge_hash)
  super({weight_initializer: @weight_initializer.to_hash,
         bias_initializer: @bias_initializer.to_hash,
         l1_lambda: @l1_lambda,
         l2_lambda: @l2_lambda}.merge(merge_hash))
end