SiLU

class torch.nn.SiLU[source]

Applies the Sigmoid Linear Unit (SiLU) function, element-wise.

Note

See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.

\text{silu}(x) = x * \sigma(x), \text{ where } \sigma(x) \text{ is the logistic sigmoid.}
Shape:
  • Input: (N, *), where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Examples:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.SiLU()
>>> input = torch.randn(2)
>>> output = m(input)
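
As a minimal sanity-check sketch (not part of the official examples), the module's output can be compared against the formula above computed directly with torch.sigmoid; assuming the two agree numerically, torch.allclose should report True:

>>> import torch
>>> import torch.nn as nn
>>> m = nn.SiLU()
>>> x = torch.randn(4)
>>> # SiLU is defined as x * sigma(x), so the manual computation should match
>>> torch.allclose(m(x), x * torch.sigmoid(x))
True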
