In this article
- Activations
- [function] !sigmoid
- [function] !hard-sigmoid
- [function] !relu
- [function] !leaky-relu
- [function] !log-softmax
- [function] !elu
- [function] !relu6
- [function] !softmax
- [function] !softplus
- [function] !softsign
- [function] !softshrink
- [function] !celu
- [function] !silu
- [function] !logsigmoid
- [function] !gelu
- [function] !selu
- [function] !mish
- [function] !hardswish
- [function] !hardtanh
- [function] !softmin
Activations
[function] !sigmoid
Applies the Sigmoid function element-wise to the input tensor.
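For reference, the element-wise formula is 1 / (1 + exp(-x)); a minimal scalar sketch in plain Common Lisp, independent of this library's tensor API:

```lisp
;; Scalar reference of Sigmoid: 1 / (1 + exp(-x)).
(defun sigmoid-ref (x)
  (/ 1.0 (+ 1.0 (exp (- x)))))
```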
[function] !hard-sigmoid
Applies the HardSigmoid function element-wise to the input tensor.
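HardSigmoid is a piecewise-linear approximation of Sigmoid. One common convention is clamp(x/6 + 1/2, 0, 1); whether !hard-sigmoid uses exactly this slope and offset is an assumption:

```lisp
;; Assumed HardSigmoid convention: clamp(x/6 + 1/2, 0, 1).
(defun hard-sigmoid-ref (x)
  (max 0.0 (min 1.0 (+ 0.5 (/ x 6.0)))))
```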
[function] !relu
Applies the ReLU function element-wise to the input tensor.
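The element-wise formula is max(0, x); as a plain scalar sketch:

```lisp
;; Scalar reference of ReLU: max(0, x).
(defun relu-ref (x)
  (max 0.0 x))
```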
[function] !leaky-relu
Applies the LeakyReLU function element-wise to the input tensor.
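LeakyReLU keeps a small slope for negative inputs. The slope of 0.01 below is an assumed default, not taken from this library:

```lisp
;; Scalar reference of LeakyReLU: x for x >= 0, slope * x otherwise.
(defun leaky-relu-ref (x &optional (slope 0.01))
  (if (>= x 0.0) x (* slope x)))
```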
[function] !log-softmax
Applies the LogSoftmax function to the input tensor.
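LogSoftmax normalizes across a set of values, log-softmax(x)_i = x_i - log(sum_j exp(x_j)); a sketch over a plain list, using the usual max-subtraction trick for numerical stability:

```lisp
;; Reference of LogSoftmax over a list of numbers.
(defun log-softmax-ref (xs)
  (let* ((m (reduce #'max xs))
         (z (log (reduce #'+ (mapcar (lambda (x) (exp (- x m))) xs)))))
    (mapcar (lambda (x) (- x m z)) xs)))
```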
[function] !elu
Applies the ELU function element-wise to the input tensor.
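ELU is the identity for positive inputs and alpha * (exp(x) - 1) otherwise; alpha = 1.0 below is an assumed default:

```lisp
;; Scalar reference of ELU.
(defun elu-ref (x &optional (alpha 1.0))
  (if (> x 0.0) x (* alpha (- (exp x) 1.0))))
```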
[function] !relu6
Applies the ReLU6 function element-wise to the input tensor.
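ReLU6 is ReLU clipped at 6, i.e. min(max(0, x), 6):

```lisp
;; Scalar reference of ReLU6: min(max(0, x), 6).
(defun relu6-ref (x)
  (min 6.0 (max 0.0 x)))
```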
[function] !softmax
Applies the Softmax function to the input tensor.
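Softmax maps a set of values to a probability distribution, softmax(x)_i = exp(x_i) / sum_j exp(x_j); a sketch over a plain list:

```lisp
;; Reference of Softmax over a list, with max subtracted for stability.
(defun softmax-ref (xs)
  (let* ((m (reduce #'max xs))
         (es (mapcar (lambda (x) (exp (- x m))) xs))
         (z (reduce #'+ es)))
    (mapcar (lambda (e) (/ e z)) es)))
```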
[function] !softplus
Applies the Softplus function element-wise to the input tensor.
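Softplus is a smooth approximation of ReLU, log(1 + exp(x)):

```lisp
;; Scalar reference of Softplus: log(1 + exp(x)).
(defun softplus-ref (x)
  (log (+ 1.0 (exp x))))
```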
[function] !softsign
Applies the Softsign function element-wise to the input tensor.
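Softsign is x / (1 + |x|):

```lisp
;; Scalar reference of Softsign: x / (1 + |x|).
(defun softsign-ref (x)
  (/ x (+ 1.0 (abs x))))
```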
[function] !softshrink
Applies the SoftShrink function element-wise to the input tensor.
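SoftShrink shrinks values toward zero by a threshold lambda and zeroes anything within [-lambda, lambda]; lambda = 0.5 below is an assumed default:

```lisp
;; Scalar reference of SoftShrink.
(defun softshrink-ref (x &optional (lam 0.5))
  (cond ((> x lam) (- x lam))
        ((< x (- lam)) (+ x lam))
        (t 0.0)))
```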
[function] !celu
Applies the CeLU function element-wise to the input tensor.
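CeLU is max(0, x) + min(0, alpha * (exp(x/alpha) - 1)); alpha = 1.0 below is an assumed default:

```lisp
;; Scalar reference of CeLU.
(defun celu-ref (x &optional (alpha 1.0))
  (+ (max 0.0 x)
     (min 0.0 (* alpha (- (exp (/ x alpha)) 1.0)))))
```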
[function] !silu
Applies the SiLU function element-wise to the input tensor.
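SiLU (also known as Swish) is x * sigmoid(x):

```lisp
;; Scalar reference of SiLU: x * sigmoid(x).
(defun silu-ref (x)
  (* x (/ 1.0 (+ 1.0 (exp (- x))))))
```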
[function] !logsigmoid
Applies the LogSigmoid function element-wise to the input tensor.
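LogSigmoid is log(sigmoid(x)), which simplifies to -log(1 + exp(-x)):

```lisp
;; Scalar reference of LogSigmoid: -log(1 + exp(-x)).
(defun logsigmoid-ref (x)
  (- (log (+ 1.0 (exp (- x))))))
```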
[function] !gelu
Applies the GeLU activation to the input tensor. There are two ways to approximate the GeLU function: :tanh uses the tanh approximation, and :sigmoid uses the sigmoid approximation.
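For reference, the two approximations mentioned above are commonly written as 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))) for :tanh and x * sigmoid(1.702 * x) for :sigmoid; whether !gelu uses exactly these constants is an assumption:

```lisp
;; Scalar references of the common GeLU approximations (assumed constants).
(defun gelu-tanh-ref (x)
  (* 0.5 x (+ 1.0 (tanh (* (sqrt (/ 2.0 pi))
                           (+ x (* 0.044715 (expt x 3))))))))

(defun gelu-sigmoid-ref (x)
  (* x (/ 1.0 (+ 1.0 (exp (- (* 1.702 x)))))))
```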