Versioned name: BatchNormInference-1
Category: Normalization
Short description: BatchNormInference layer normalizes an input tensor by mean and variance, and applies a scale (gamma) to it, as well as an offset (beta).
Attributes:

* epsilon
  * Description: the number to be added to the variance to avoid division by zero when normalizing a value
  * Range of values: a positive floating-point number
  * Type: float
  * Required: yes
Inputs

* input - input tensor with data for normalization. At least a 2D tensor of type T; the second dimension represents the channel axis and must have a span of at least 1. Required.
* gamma - gamma scaling for normalized value. A 1D tensor of type T with the same span as input's channel axis. Required.
* beta - bias added to the scaled normalized value. A 1D tensor of type T with the same span as input's channel axis. Required.
* mean - value for mean normalization. A 1D tensor of type T with the same span as input's channel axis. Required.
* variance - value for variance normalization. A 1D tensor of type T with the same span as input's channel axis. Required.

Outputs

* output - the result of normalization. A tensor of type T with the same shape as input.
Types

* T: any numeric type.
Mathematical Formulation
BatchNormInference normalizes the activations of each hidden layer. For a mini-batch of m values

\[ \mathcal{B} = \{ b_{1 \ldots m} \} \]

each output is produced by the batch-normalizing transform

\[ o_{i} = BN_{\gamma, \beta} ( b_{i} ) \]

defined through the mini-batch mean and variance:

\[ \mu_{\mathcal{B}} \leftarrow \frac{1}{m}\sum_{i=1}^{m} b_{i} \]
\[ \sigma_{\mathcal{B}}^{2} \leftarrow \frac{1}{m}\sum_{i=1}^{m} ( b_{i} - \mu_{\mathcal{B}} )^{2} \]

Each value is normalized, with epsilon stabilizing the denominator, then scaled by gamma and shifted by beta:

\[ \hat{b}_{i} \leftarrow \frac{b_{i} - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^{2} + \epsilon}} \]
\[ o_{i} \leftarrow \gamma \hat{b}_{i} + \beta = BN_{\gamma, \beta} ( b_{i} ) \]

In inference mode the statistics are not computed from the current batch; the values supplied through the mean and variance inputs are used in place of \( \mu_{\mathcal{B}} \) and \( \sigma_{\mathcal{B}}^{2} \).
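The inference computation above can be sketched in NumPy. This is a minimal illustration, not a reference implementation: the function name and the default epsilon value are assumptions, and the input is taken to have its channel axis in the second dimension as the Inputs section specifies.

```python
import numpy as np

def batch_norm_inference(x, gamma, beta, mean, variance, epsilon=1e-5):
    """Sketch of BatchNormInference. x has its channel axis in the second
    dimension (e.g. NCHW); gamma, beta, mean and variance are 1D arrays
    whose length equals the channel span. Names and epsilon default are
    illustrative assumptions, not taken from the specification."""
    # Reshape the per-channel 1D tensors so they broadcast along axis 1.
    shape = (1, -1) + (1,) * (x.ndim - 2)
    mean = mean.reshape(shape)
    variance = variance.reshape(shape)
    gamma = gamma.reshape(shape)
    beta = beta.reshape(shape)
    # Normalize with the supplied statistics, then scale and shift.
    x_hat = (x - mean) / np.sqrt(variance + epsilon)
    return gamma * x_hat + beta

# Usage: a 1x2x2x2 input normalized with its own per-channel statistics.
x = np.arange(8, dtype=np.float32).reshape(1, 2, 2, 2)
out = batch_norm_inference(x,
                           gamma=np.ones(2, np.float32),
                           beta=np.zeros(2, np.float32),
                           mean=x.mean(axis=(0, 2, 3)),
                           variance=x.var(axis=(0, 2, 3)))
```

With gamma of ones and beta of zeros, the output is simply the normalized input, so each channel of `out` has mean close to zero and unit variance up to the epsilon term.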
Example