Method: Transformers::GELUActivation#_gelu_python
Defined in: lib/transformers/activations.rb
#_gelu_python(input) ⇒ Object
# File 'lib/transformers/activations.rb', line 26

def _gelu_python(input)
  input * 0.5 * (1.0 + Torch.erf(input / Math.sqrt(2.0)))
end
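This method evaluates GELU(x) = x * Φ(x) = 0.5 * x * (1 + erf(x / √2)) directly through the error function rather than a tanh approximation. Below is a minimal usage sketch, assuming the torch-rb gem (required as "torch") supplies Torch.tensor and the same Torch.erf call used above; it applies the identical erf-based formula to a sample tensor outside the class, so the expected outputs can be checked by hand.

require "torch"

x = Torch.tensor([-1.0, 0.0, 1.0])

# Same erf-based formula as _gelu_python: 0.5 * x * (1 + erf(x / sqrt(2)))
gelu = x * 0.5 * (1.0 + Torch.erf(x / Math.sqrt(2.0)))

puts gelu.to_a.inspect
# => roughly [-0.1587, 0.0, 0.8413], i.e. each x scaled by the standard normal CDF Φ(x)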