ONNX

import sol.onnx as sol
import numpy as np

model  = sol.optimize("myModel.onnx") # no input description needed; the ONNX model already provides it
input  = np.random.rand(1, 3, 224, 224).astype(np.float32) # most ONNX vision models expect float32 input
output = model(input)

If the ONNX model was exported with a variable batch size (e.g. using PyTorch’s dynamic_axes), you need to call sol.optimize(..., batch_size=BS), where BS is an int used by SOL’s heuristics during code generation, as shown below.
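
For example (the value 32 below is purely illustrative; choose a batch size representative of your workload):

model = sol.optimize("myModel.onnx", batch_size=32) # illustrative batch size used by SOL's heuristics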

To execute the model on another device, call sol.device.set(sol.device.TYPE, DEVICE_IDX) before calling model(input).
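
A minimal sketch (the device-type constant sol.device.ve for the NEC SX-Aurora vector engine is an assumption here; substitute the TYPE constant provided by your SOL installation):

sol.device.set(sol.device.ve, 0) # assumed device type; 0 is the device index
output = model(input)            # subsequent calls execute on the selected device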

Deploying ONNX Models

See Deployment.

Tested Networks

  • TorchVision
    • AlexNet
    • SqueezeNet (1.0, 1.1)
    • VGG (11, 13, 16, 19, w/ and w/o batchnorm)
    • ResNet (18, 34, 50, 101, 152)
    • DenseNet (121, 161, 169, 201)
    • MNasNet (0.5, 0.75, 1.0, 1.3)
    • ShuffleNet v2 (0.5, 1.0, 1.5, 2.0)
    • ResNeXt (50, 101)
    • WideResNet (50, 101)
  • MobileNet
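
Any of the networks above can be exported from TorchVision to ONNX and then passed to SOL. The following sketch assumes PyTorch and TorchVision are installed alongside SOL and uses resnet18 purely as an example:

import torch
import torchvision.models as models
import numpy as np
import sol.onnx as sol

net   = models.resnet18().eval()               # any of the tested networks works the same way
dummy = torch.randn(1, 3, 224, 224)            # example input used for the ONNX export
torch.onnx.export(net, dummy, "resnet18.onnx") # standard PyTorch ONNX export

model  = sol.optimize("resnet18.onnx")
output = model(np.random.rand(1, 3, 224, 224).astype(np.float32))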

Supported Layers

Please refer to https://github.com/onnx/onnx/blob/master/docs/Operators.md for how these operators are used. This page only lists which layers, functions, and tensor operations are currently implemented within SOL.
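
To check whether a model stays within this set, the operators it uses can be listed with the onnx Python package (a small sketch, assuming the model file is named myModel.onnx):

import onnx

m = onnx.load("myModel.onnx")
used_ops = sorted({node.op_type for node in m.graph.node})
print(used_ops) # compare against the operator list below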

Abs
Acos
Acosh
Add
And
Asin
Asinh
Atan
Atanh
AveragePool
BatchNormalization
Cast
Ceil
Celu
Clip
Concat
Constant
ConstantOfShape
Conv
Cos
Cosh
Div
Dropout
Elu
Equal
Erf
Exp
Expand
Flatten
Floor
Gather
Gemm
GlobalAveragePool
GlobalMaxPool
Greater
GreaterOrEqual
Identity
LeakyRelu
Less
LessOrEqual
LogSoftmax
MatMul
Max
MaxPool
Min
Mod
Mul
Neg
Not
Or
PRelu
Pad
Pow
Range
Reciprocal
ReduceMax
ReduceMean
ReduceMin
ReduceProd
ReduceSum
ReduceSumSquare
Relu
Reshape
Selu
Shape
Shrink
Sigmoid
Sign
Sin
Sinh
Softmax
Softmin
Softplus
Softsign
Split
Sqrt
Squeeze
Sub
Sum
Tan
Tanh
Transpose
Unsqueeze
Where
Xor