Versioned name : HardSigmoid-1

Category : Activation function

Short description : HardSigmoid element-wise activation function.

Attributes : HardSigmoid operation has no attributes.

Mathematical Formulation

For each element of the input tensor, the operation calculates the corresponding element of the output tensor with the following formula:

\[y = \max(0,\ \min(1,\ \alpha x + \beta))\]

where α corresponds to the alpha scalar input and β corresponds to the beta scalar input.
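The formula above can be sketched in NumPy; `hard_sigmoid` is a hypothetical helper name, not part of any library API:

```python
import numpy as np

def hard_sigmoid(x, alpha, beta):
    # y = max(0, min(1, alpha * x + beta)), applied element-wise.
    # np.clip performs both the lower bound (0) and upper bound (1) at once.
    return np.clip(alpha * x + beta, 0.0, 1.0)

# With alpha = 0.2 and beta = 0.5:
#   x = -3 -> 0.2 * -3 + 0.5 = -0.1, clipped to 0
#   x =  0 -> 0.5, unchanged
#   x =  3 -> 1.1, clipped to 1
y = hard_sigmoid(np.array([-3.0, 0.0, 3.0]), 0.2, 0.5)
```

Values of α and β here are arbitrary illustrations; in the operation they arrive as the second and third scalar inputs.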


Inputs

  • 1 : A tensor of type T. Required.

  • 2 : alpha 0D tensor (scalar) of type T. Required.

  • 3 : beta 0D tensor (scalar) of type T. Required.


Outputs

  • 1 : The result of the hard sigmoid operation. A tensor of type T.


Types

  • T : any floating-point type.


Example

<layer ... type="HardSigmoid">
    <input>
        <port id="0">
            <dim>256</dim>
            <dim>56</dim>
        </port>
        <port id="1"/>
        <port id="2"/>
    </input>
    <output>
        <port id="3">
            <dim>256</dim>
            <dim>56</dim>
        </port>
    </output>
</layer>