Operators
=========

.. contents::
    :depth: 2
    :local:

Operator base class
-------------------

``Operator`` is Aidge's base class for describing a mathematical operator. It does not make any assumption on the data coding.

.. tab-set::

    .. tab-item:: Python

        .. autoclass:: aidge_core.Operator
            :members:
            :inherited-members:

    .. tab-item:: C++

        .. doxygenclass:: Aidge::Operator

OperatorTensor base class
-------------------------

``OperatorTensor`` derives from the ``Operator`` base class and is itself the base class for any tensor-based operator.

.. tab-set::

    .. tab-item:: Python

        .. autoclass:: aidge_core.OperatorTensor
            :members:
            :inherited-members:

    .. tab-item:: C++

        .. doxygenclass:: Aidge::OperatorTensor

Generic Operator
----------------

A generic tensor-based operator can model any kind of mathematical operator that takes a defined number of inputs, produces a defined number of outputs and may carry attributes. A function computing the output tensor sizes from the input tensor sizes can optionally be provided. It comes with a default consumer-producer model (it requires and consumes the full input tensors and produces the full output tensors). This is the default operator used for unsupported ONNX operators when loading an ONNX model.

While it obviously cannot be executed, a generic operator is still useful:

- It allows loading any graph, even one containing unknown operators. All missing operator types and their positions in the graph can be identified exactly;
- It can be searched and manipulated with graph matching, allowing for example to replace it with alternative operators;
- It can be scheduled and included in the static scheduling of the graph;
- 🚧 A custom implementation may be provided in the future, even in pure Python, for rapid integration and prototyping.

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.GenericOperator

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::GenericOperator
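As an illustration, the sketch below creates a placeholder node for an unsupported operator type with the C++ helper documented above. The operator type ``"MyCustomOp"``, the input/output counts and the node name are made-up values for the example, not part of Aidge:

.. code-block:: c++

    // Minimal sketch (illustrative values): declare a placeholder operator with
    // one data input, no parameter (weight) inputs and one output.
    auto node = GenericOperator("MyCustomOp",   // operator type
                                1,              // number of data inputs
                                0,              // number of parameter inputs
                                1,              // number of outputs
                                "my_custom_op"  // node name
    );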
Meta Operator
-------------

A meta-operator (or composite operator) is internally built from a sub-graph.

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.meta_operator

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::MetaOperator

Building a new meta-operator is simple:

.. code-block:: c++

    // padding_dims, kernel_dims, stride_dims, ceil_mode and name are assumed to be
    // provided by the enclosing helper function.
    auto graph = Sequential({
        Pad<2>(padding_dims, (!name.empty()) ? name + "_pad" : ""),
        MaxPooling(kernel_dims, (!name.empty()) ? name + "_maxpooling" : "", stride_dims, ceil_mode)
    });

    return MetaOperator("PaddedMaxPooling2D", graph, name);
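Once built, a meta-operator node behaves like any other node and can be composed with regular operators. The following sketch is illustrative only: the surrounding operators, channel counts and dimension values are assumptions, not prescribed by the API.

.. code-block:: c++

    // Minimal sketch (illustrative values): use a meta-operator node inside a graph.
    auto pool = MetaOperator("PaddedMaxPooling2D",
                             Sequential({
                                 Pad<2>({1, 1, 1, 1}, "pool1_pad"),
                                 MaxPooling({2, 2}, "pool1_maxpooling", {2, 2})
                             }),
                             "pool1");

    // The meta-operator node composes with regular operators like any other node.
    auto model = Sequential({
        Conv(3, 32, {3, 3}, "conv1"),
        ReLU("conv1_relu"),
        pool
    });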
You can use the :ref:`Expand meta operators recipe ` to flatten the meta-operators in a graph.

Predefined operators
--------------------

Add
~~~

.. jinja:: AddOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Add

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Add

Average Pooling
~~~~~~~~~~~~~~~

.. jinja:: AvgPoolingOp2D_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.AvgPooling1D
        .. autofunction:: aidge_core.AvgPooling2D
        .. autofunction:: aidge_core.AvgPooling3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::AvgPooling(DimSize_t const (&kernel_dims)[DIM], const std::string& name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1))

BatchNorm
~~~~~~~~~

.. jinja:: BatchNormOp2D_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.BatchNorm2D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::BatchNorm

Cast
~~~~

.. tab-set::

    .. tab-item:: Python

        Not available yet!

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Cast

Concat
~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Concat

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Concat

Conv
~~~~

.. jinja:: ConvOp2D_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Conv1D
        .. autofunction:: aidge_core.Conv2D
        .. autofunction:: aidge_core.Conv3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Conv(DimSize_t in_channels, DimSize_t out_channels, DimSize_t const (&kernel_dims)[DIM], const std::string& name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, DIM> &dilation_dims = create_array<DimSize_t,DIM>(1), bool noBias = false)

ConvDepthWise
~~~~~~~~~~~~~

.. jinja:: ConvDepthWiseOp2D_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.ConvDepthWise1D
        .. autofunction:: aidge_core.ConvDepthWise2D
        .. autofunction:: aidge_core.ConvDepthWise3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::ConvDepthWise(const DimSize_t nbChannels, DimSize_t const (&kernelDims)[DIM], const std::string &name = "", const std::array<DimSize_t, DIM> &strideDims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, DIM> &dilationDims = create_array<DimSize_t,DIM>(1), bool noBias = false)

Div
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Div

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Div

Erf
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Erf

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Erf

FC
~~

.. jinja:: FCOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.FC

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::FC

Gather
~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Gather

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Gather

Identity
~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Identity

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Identity

LeakyReLU
~~~~~~~~~

.. jinja:: LeakyReLUOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.LeakyReLU

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::LeakyReLU

MatMul
~~~~~~

.. jinja:: MatMulOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.MatMul

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::MatMul

Memorize
~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        Not available yet!

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Memorize

Move
~~~~

.. tab-set::

    .. tab-item:: Python

        Not available yet!

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Move

Mul
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Mul

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Mul

Pad
~~~

.. tab-set::

    .. tab-item:: Python

        Not available yet!

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Pad(const std::array<DimSize_t, 2*DIM> &beginEndTuples, const std::string &name = "", const PadBorderType &borderType = PadBorderType::Constant, double borderValue = 0.0)

Pop
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Pop

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Pop

Pow
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Pow

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Pow

Producer
~~~~~~~~

.. jinja:: ProducerOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Producer

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Producer(const std::shared_ptr<Tensor> tensor, const std::string &name = "", bool constant = false)
        .. doxygenfunction:: Aidge::Producer(const std::array<DimSize_t, DIM> &dims, const std::string &name = "", bool constant = false)

ReduceMean
~~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.ReduceMean

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::ReduceMean

ReLU
~~~~

.. jinja:: ReLUOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.ReLU

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::ReLU

Reshape
~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Reshape

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Reshape

Scaling
~~~~~~~

.. tab-set::

    .. tab-item:: Python

        Not available yet!

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Scaling

Sigmoid
~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Sigmoid

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Sigmoid

Slice
~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Slice

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Slice

Softmax
~~~~~~~

.. jinja:: SoftmaxOp_in_out
    :file: jinja/op_in_out_mmd.jinja
    :header_char: -

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Softmax

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Softmax

Sqrt
~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Sqrt

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Sqrt

Sub
~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Sub

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Sub

Tanh
~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Tanh

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Tanh

Transpose
~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.Transpose

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::Transpose

Predefined meta-operators
-------------------------

Some meta-operators (or composite operators) are predefined for convenience and/or for compatibility with other frameworks.
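For example, a padded convolution is simply a ``Pad`` followed by a ``Conv``, exposed as a single node. The call below is a minimal sketch with made-up channel, kernel, stride and padding values:

.. code-block:: c++

    // Minimal sketch (illustrative values): a 3x3 convolution with 1-pixel
    // padding on each border, obtained as a single PaddedConv meta-operator node.
    auto conv = PaddedConv(3,              // input channels
                           32,             // output channels
                           {3, 3},         // kernel dimensions
                           "conv1",        // node name
                           {1, 1},         // stride
                           {1, 1, 1, 1});  // begin/end padding for each spatial dimension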
PaddedConv
~~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.PaddedConv2D
        .. autofunction:: aidge_core.PaddedConv3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::PaddedConv(DimSize_t in_channels, DimSize_t out_channels, DimSize_t const (&kernel_dims)[DIM], const std::string& name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, 2*DIM> &padding_dims = create_array<DimSize_t,2*DIM>(0), const std::array<DimSize_t, DIM> &dilation_dims = create_array<DimSize_t,DIM>(1), bool no_bias = false)

PaddedConvDepthWise
~~~~~~~~~~~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.PaddedConvDepthWise2D
        .. autofunction:: aidge_core.PaddedConvDepthWise3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::PaddedConvDepthWise(const DimSize_t nb_channels, const std::array<DimSize_t, DIM> &kernel_dims, const std::string& name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, 2*DIM> &padding_dims = create_array<DimSize_t,2*DIM>(0), const std::array<DimSize_t, DIM> &dilation_dims = create_array<DimSize_t,DIM>(1), bool no_bias = false)

PaddedAvgPooling
~~~~~~~~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.PaddedAvgPooling2D
        .. autofunction:: aidge_core.PaddedAvgPooling3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::PaddedAvgPooling(DimSize_t const (&kernel_dims)[DIM], const std::string& name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, 2*DIM> &padding_dims = create_array<DimSize_t,2*DIM>(0))

PaddedMaxPooling
~~~~~~~~~~~~~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.PaddedMaxPooling2D
        .. autofunction:: aidge_core.PaddedMaxPooling3D

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::PaddedMaxPooling(const std::array<DimSize_t, DIM> &kernel_dims, const std::string &name = "", const std::array<DimSize_t, DIM> &stride_dims = create_array<DimSize_t,DIM>(1), const std::array<DimSize_t, 2*DIM> &padding_dims = create_array<DimSize_t,2*DIM>(0), bool ceil_mode = false)

LSTM
~~~~

.. tab-set::

    .. tab-item:: Python

        .. autofunction:: aidge_core.LSTM

    .. tab-item:: C++

        .. doxygenfunction:: Aidge::LSTM