OPERATORS.md

April 4, 2023

| Operator | Description |
| --- | --- |
| Abs | y = (x > 0) ? x : -x |
| ArgMax | max value index |
| AttentionMask | transformer local attention mask |
| Attention | transformer global attention mask |
| BatchNorm | y = (x - mean) / sqrt(variance + eps) per channel |
| BatchToSpaceNd | tensorflow batch_to_space function |
| BilateralSliceApply | hdrnet BilateralSliceApply function; usable via a caffe or onnx self-defined operator, see inference/examples/bilateral_slice_apply/README.md |
| Cast | change tensor data type |
| Ceil | y = ceil(x) |
| ChannelResize | channel padding or channel cut |
| Check | element-level compare, same as onnx Greater, GreaterOrEqual, Equal, LessOrEqual, Less |
| Clip | y = clip(x, min, max) |
| Concat | concatenate multiple tensors along a given axis |
| ConstantOfShape | allocate memory (not implemented) |
| Constant | onnx Constant |
| ConvertColor | YUV_NV21 <-> RGB, BGR, RGBA, BGRA; usable via a caffe or onnx self-defined operator, see inference/examples/convert_color/README.md |
| Convolution | common 1D/2D/3D convolution, dilated 1D/2D/3D convolution, group 1D/2D/3D convolution, depthwise 1D/2D convolution |
| Copy | memory copy |
| Cos | y = cos(x) |
| Cum | prefix-scan function; currently supports cumsum and cumprod |
| Deconvolution | 1D/2D deconvolution, onnx ConvTranspose |
| Depth2Space | tensorflow depth_to_space function |
| DetectionOutput | SSD caffe DetectionOutput |
| Dropout | dropout function |
| Einsum | same as onnx Einsum |
| Elu | elu activation function |
| Eltwise | sum, min, max, mul(prod), sub, div elementwise operations |
| Embedding | Caffe embedding |
| Equal | elementwise tensor compare, same as onnx Equal; also supports tflite NOT_EQUAL. Equal is replaced with Check. |
| Erf | erf(x) = 2/sqrt(pi) * integral from 0 to x of exp(-t^2) dt |
| Expand | onnx Expand |
| Exp | y = exp(x) |
| Flatten | same as onnx Flatten |
| Floor | y = floor(x) |
| FullyConnected | onnx Gemm, Linear |
| GAT | graph attention module |
| Gather | onnx Gather, GatherElements, GatherND; also same as embedding |
| Gelu | gelu activation |
| GenerateProposals | same as tf.image.generate_bounding_box_proposals |
| Greater | elementwise tensor compare, same as onnx Greater |
| GridSample | same as onnx GridSample |
| HSigmoid | hard sigmoid, y = clip((x + 1) / 2, 0, 1) |
| HSwishNoDiv | y = x * relu6(x + 3) |
| HSwish | y = x * relu6(x + 3) / 6 |
| InstanceNorm | instance normalization |
| Jump | if statement for dynamic control flow |
| L2Normalization | L2 normalization |
| LayerNorm | layer normalization |
| LeakyRelu | relu with a nonzero slope (scale != 0) when x < 0 |
| LogSoftmax | log softmax |
| Log | y = log(x) |
| Matmul | matrix multiply |
| Mish | y = x * tanh(log(1 + e^x)) |
| MultiHeadAttention | transformer multi-head attention |
| Neg | y = -x |
| NonMaxSuppression | same as onnx NonMaxSuppression |
| NonZero | same as onnx NonZero |
| Not | y = !x, same as onnx Not |
| OneHot | same as onnx OneHot |
| Pad | constant(0), reflect, edge, symmetric padding |
| Pooling | max, mean pooling |
| Power | y = (scale * x + shift) ^ pow |
| PreAllocatedMemory | allocate memory |
| Prelu | prelu activation |
| PriorBox | SSD caffe PriorBox |
| QuantizeLinear | int8 quantization |
| Random | random function; currently supports uniform and normal distributions |
| Range | same as onnx Range |
| Reciprocal | same as onnx Reciprocal, y = 1 / x |
| Reduction | sum, min, max, mean reduction |
| RelativePositionEmbedding | self-defined relative position embedding operator |
| RelativeShift | self-defined relative shift operator |
| Relu6 | y = relu6(x) |
| Relu | relu (scale = 0 when x < 0) |
| Repeat | do-while loop for dynamic control flow |
| Reshape | change dimensions |
| Resize | linear, nearest, cubic resize; same as onnx Resize, Upsample |
| RNN | LSTM, PLSTM, GRU, onnx LBR GRU, onnx Scan; also supports bi-direction |
| RoIAlign | same as onnx RoiAlign |
| Round | y = round(x) |
| Scale | y = alpha * x + beta per channel |
| Scatter | onnx Scatter, ScatterElements, ScatterND |
| Select | y = choice ? a : b, same as tflite SELECT |
| Shape | get tensor shape |
| SharedWeight | represents an onnx/tflite operator input that is not produced by another operator |
| Sigmoid | sigmoid activation |
| Sign | y = sign(x) |
| Sin | y = sin(x) |
| Slice | caffe Slice |
| SoftmaxWithLoss | softmax with loss (not implemented) |
| Softmax | y = exp(x - max(x)) / sum(exp(x - max(x))) |
| SoftPlus | y = log(1 + e^x) |
| Space2Depth | tensorflow space_to_depth function |
| SpaceToBatchNd | tensorflow space_to_batch function |
| Splice | Kaldi feature-extraction function, same as Gather |
| Split | same as onnx Split |
| SqDiff | tflite SQUARED_DIFFERENCE |
| Squeeze | remove a dimension of size 1 |
| Swish | y = x * sigmoid(x) |
| TanH | y = tanh(x) |
| Tdnn | Kaldi TDNN operator (Splice + Linear) |
| TfSlice | onnx or tflite slice, strided slice |
| Tile | onnx Tile |
| TopK | same as onnx TopK |
| Transpose | transpose data, same as caffe Permute |
| UnPooling | same as onnx unpooling |
| Unsqueeze | insert a dimension of size 1 |
| Where | same as onnx Where |
| Yolov3DetectionOutput | caffe Yolov3 DetectionOutput |
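A few of the formulas in the table can be sketched directly as scalar reference code. The snippet below is a pure-Python illustration of the HSwish, Mish, Power, and Softmax definitions listed above; these are illustrative sketches, not bolt's actual kernels, and the function names are chosen for this example only.

```python
import math

# Reference sketches of formulas from the operator table (illustration only).

def relu6(x):
    return min(max(x, 0.0), 6.0)

def hswish(x):
    # HSwish: y = x * relu6(x + 3) / 6
    return x * relu6(x + 3.0) / 6.0

def mish(x):
    # Mish: y = x * tanh(log(1 + e^x))
    return x * math.tanh(math.log1p(math.exp(x)))

def power(x, scale=1.0, shift=0.0, p=1.0):
    # Power: y = (scale * x + shift) ^ pow
    return (scale * x + shift) ** p

def softmax(xs):
    # Softmax with the max-subtraction trick shown in the table:
    # y = exp(x - max(x)) / sum(exp(x - max(x)))
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(hswish(3.0))                               # 3 * relu6(6) / 6 = 3.0
print(power(2.0, scale=3.0, shift=1.0, p=2.0))   # (3*2 + 1)^2 = 49.0
print(softmax([1000.0, 1001.0, 1002.0]))         # no overflow despite large inputs
```

Note that subtracting max(x) before exponentiating (as the Softmax row specifies) is what keeps exp from overflowing on large logits; without it, exp(1002.0) would overflow a float.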