
Activation

Activations

[function] !sigmoid

(!sigmoid x)
Applies the Sigmoid function element-wise to the input tensor.
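
A minimal usage sketch (assuming the :caten system is already loaded and that make-tensor with :initial-element and proceed from caten/api behave as in the Quickstart; the package setup and the input values are illustrative):

;; !sigmoid only records a lazy node; proceed compiles and runs the graph.
(defpackage :activation-example (:use :cl :caten/api :caten/nn))
(in-package :activation-example)

(proceed (!sigmoid (make-tensor `(3 3) :initial-element 0.5)))

The same call pattern applies to every activation on this page.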

[function] !hard-sigmoid

(!hard-sigmoid x &key (alpha 0.2) (beta 0.5))
Applies the HardSigmoid function element-wise to the input tensor.

[function] !relu

(!relu x)
Applies the ReLU function element-wise to the input tensor.

[function] !leaky-relu

(!leaky-relu x &key (neg-slope 1e-3))
Applies the LeakyReLU function element-wise to the input tensor.
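
Negative inputs are scaled by :neg-slope rather than zeroed. A sketch reusing the setup from the !sigmoid example (the slope value is illustrative):

;; With a negative slope of 0.01, each -1.0 becomes -0.01.
(proceed (!leaky-relu (make-tensor `(2 2) :initial-element -1.0) :neg-slope 0.01))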

[function] !log-softmax

(!log-softmax x &key (axis -1))
Applies the LogSoftmax function to the input tensor along the axis given by :axis.

[function] !elu

(!elu x &key (alpha 1.0))
Applies the ELU function element-wise to the input tensor.

[function] !relu6

(!relu6 x)
Applies the ReLU6 function element-wise to the input tensor.

[function] !softmax

(!softmax x &key (axis -1))
Applies the Softmax function to the input tensor along the axis given by :axis.
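
The normalization axis is selected with :axis; the same keyword applies to !log-softmax above. A sketch reusing the setup from the !sigmoid example:

;; Default :axis -1: each row of the (4 10) tensor sums to 1.
(proceed (!softmax (make-tensor `(4 10) :initial-element 1.0)))
;; :axis 0: each column sums to 1 instead.
(proceed (!softmax (make-tensor `(4 10) :initial-element 1.0) :axis 0))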

[function] !softplus

(!softplus x &key (beta 1.0))
Applies the Softplus function element-wise to the input tensor.

[function] !softsign

(!softsign x)
Applies the Softsign function element-wise to the input tensor.

[function] !softshrink

(!softshrink x &key (lmd 0.5))
Applies the SoftShrink function element-wise to the input tensor.
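
Inputs with magnitude at most :lmd are mapped to zero, and larger magnitudes are shifted toward zero by :lmd. A sketch reusing the setup from the !sigmoid example (values illustrative):

;; |0.3| <= 0.5 (the default lmd), so every element becomes 0.0.
(proceed (!softshrink (make-tensor `(3 3) :initial-element 0.3)))
;; 2.0 > 0.5, so every element becomes 2.0 - 0.5 = 1.5.
(proceed (!softshrink (make-tensor `(3 3) :initial-element 2.0) :lmd 0.5))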

[function] !celu

(!celu x &key (alpha 1.0))
Applies the CeLU function element-wise to the input tensor.

[function] !silu

(!silu x)
Applies the SiLU function element-wise to the input tensor.

[function] !logsigmoid

(!logsigmoid x)
Applies the LogSigmoid function element-wise to the input tensor.

[function] !gelu

(!gelu x &key (approx :tanh))
Applies the GeLU activation to the input tensor. The :approx keyword selects the approximation: :tanh uses the tanh-based approximation, :sigmoid uses the sigmoid-based approximation.
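
A sketch of selecting the approximation, reusing the setup from the !sigmoid example:

;; Default: tanh-based approximation.
(proceed (!gelu (make-tensor `(3 3) :initial-element 1.0)))
;; Sigmoid-based approximation.
(proceed (!gelu (make-tensor `(3 3) :initial-element 1.0) :approx :sigmoid))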

[function] !selu

(!selu x)
Applies the SeLU function element-wise to the input tensor.

[function] !mish

(!mish x)
Applies the Mish function element-wise to the input tensor.

[function] !hardswish

(!hardswish x)
Applies the HardSwish function element-wise to the input tensor.

[function] !hardtanh

(!hardtanh x &key (min_val -1.0) (max_val 1.0))
Applies the HardTanh function element-wise to the input tensor.
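
The clipping range is set with :min_val and :max_val. A sketch reusing the setup from the !sigmoid example (the range is illustrative):

;; Clamp every activation into [0.0, 6.0]; here 10.0 becomes 6.0.
(proceed (!hardtanh (make-tensor `(3 3) :initial-element 10.0) :min_val 0.0 :max_val 6.0))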

[function] !softmin

(!softmin x)
Applies the Softmin function to the input tensor.
