Class ov::op::internal::GLU

class GLU : public ov::op::Op

Operator performing Gated Linear Unit activation. This operation performs a gated linear unit activation in which the gating branch uses a Swish or GELU activation function.
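In common GLU formulations (stated here as a general description, not taken from this reference), the input is split along “axis” into two chunks a and b, and the result is act(a) * b, where act is the selected Swish, Gelu or Gelu_Tanh activation and “split_to_glu_idx” selects which split output the activation is applied to.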

Public Functions

GLU(const Output<Node> &data, int64_t axis, int64_t split_lengths, const GluType glu_type, const size_t split_to_glu_idx, const ov::element::Type output_type = ov::element::dynamic)

Constructs a GLU operation.

Parameters:
  • data – Input tensor with data

  • axis – The index of an axis in “data” along which to perform the split

  • split_lengths – A list containing the sizes of each output tensor along the split “axis”

  • glu_type – GLU type, one of Swish, Gelu and Gelu_Tanh

  • split_to_glu_idx – Output index of variadic split, which is connected to GLU

  • output_type – Output element type
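
A minimal construction sketch (not taken from this reference): it assumes the internal header is reachable as "ov_ops/glu.hpp", that GluType is a nested enum of GLU spelled as shown, and uses illustrative shapes, axis and split lengths.

#include <memory>

#include "openvino/op/parameter.hpp"
#include "ov_ops/glu.hpp"  // assumed location of the internal GLU op header

int main() {
    // Hidden dimension 8192 is split on the last axis into two chunks of 4096;
    // one chunk is activated and used to gate the other.
    auto data = std::make_shared<ov::op::v0::Parameter>(ov::element::f32, ov::Shape{1, 8, 8192});

    const int64_t axis = -1;             // split along the last dimension
    const int64_t split_lengths = 4096;  // size of each chunk produced by the split
    const size_t split_to_glu_idx = 0;   // split output the activation is applied to

    auto glu = std::make_shared<ov::op::internal::GLU>(
        data,
        axis,
        split_lengths,
        ov::op::internal::GLU::GluType::Swish,  // assumed enum spelling
        split_to_glu_idx);

    glu->validate_and_infer_types();  // infers the output shape {1, 8, 4096} and element type f32
    return 0;
}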

virtual void validate_and_infer_types() override

Verifies that attributes and inputs are consistent and computes output shapes and element types. Must be implemented by concrete child classes so that it can be run any number of times.

Throws if the node is invalid.