ONNX requirements
In previous sections we detailed how to create ONNX graphs from scratch (including a large number of examples in the sclblonnx package). Alternatively, you can export your trained model from your favourite training platform. In both cases you will end up with an .onnx file that you would like to upload to the Scailable platform for deployment. For the automatic conversion of the ONNX graph to SPMF we support a subset of ONNX. This page describes how to check your ONNX graph and ensure that it fits the Scailable platform requirements for deployment.
Note that the sclblonnx package (git, pip) provides functions to automatically check locally whether or not your .onnx file adheres to the Scailable requirements.
Here we list all the requirements that your ONNX graph should adhere to before uploading it to the Scailable platform:
- Inputs: As we currently (publicly) provide support solely for vision models (i.e., models that operate on images), ONNX graphs uploaded to our platform should have an image tensor as input. We support two orderings for this input tensor:
  - (n)chw: an encoding with (optionally) the number of images n (which should be set to 1), the number of color channels c (1 for grayscale, 3 for color), and next the height h and width w of the input image.
  - (n)hwc: similar to the above, but with the order of the channel and pixel dimensions changed.
Note that if the ONNX graph has a single input, the AI manager will try to fit the input image grabbed from the camera to this input. Ideally, however, the image input to the graph is explicitly named and has an image- prefix (see also our named model inputs and outputs). A sketch showing how to inspect the graph inputs by hand is provided after this list.
We recommend using Netron to visually inspect your ONNX graph to ensure that the inputs and outputs match the descriptions provided here.
- Operators: Within the ONNX graph we support almost all, but not all, ONNX operators. Please see our list of supported operators below (or, easier, check adherence of the ONNX graph automatically using the sclblonnx package).
- Graph size: We impose limits on the file size of the uploaded ONNX graph. For trial accounts the limit is 25Mb for the full ONNX graph (simply check the file size of the .onnx file you intend to upload to our platform). For licensed platform accounts we currently impose a size limit of 250Mb.
If you have a licensed account and would like to upload larger models, please reach out to our support team using our chat and we will increase the limit for your account.
Please bear in mind that you will be deploying the AI model and pipeline specified in your ONNX graph to an edge device with constrained resources. Always test the performance of your pipeline on your target device before production deployment.
- Data type(s): We support all data types in the ONNX graph with the exception of STRING. Furthermore, we currently only support static (fixed-size) tensors throughout the graph.
- Output: We impose no constraints (other than those already imposed by the data types) on the outputs of the ONNX graph. However, some output names will trigger contextual post-processing within the AI manager.
- Parsing: Potentially redundant, but obviously the ONNX graph needs to be valid and produce the appropriate output. Please run a local test before uploading the ONNX graph to the Scailable platform.
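The requirements above can also be verified by hand with the standard onnx Python package. The sketch below is illustrative only: the model.onnx file name is a placeholder, and the checks simply mirror the requirements listed above (file size against the 25Mb trial limit, static input shapes, no STRING tensors, and an image- prefix on the image input).
import os
import onnx
from onnx import TensorProto

MODEL_PATH = "model.onnx"              # placeholder: the file you intend to upload
TRIAL_LIMIT_BYTES = 25 * 1024 * 1024   # 25Mb limit for trial accounts

# Graph size: compare the file size against the trial-account limit.
size = os.path.getsize(MODEL_PATH)
print(f"File size: {size / (1024 * 1024):.1f}Mb")
if size > TRIAL_LIMIT_BYTES:
    print("Graph exceeds the 25Mb trial limit.")

# Inputs and data types: inspect every graph input.
model = onnx.load(MODEL_PATH)
for inp in model.graph.input:
    ttype = inp.type.tensor_type
    dims = [d.dim_value if d.HasField("dim_value") else None
            for d in ttype.shape.dim]
    dtype = TensorProto.DataType.Name(ttype.elem_type)
    print(f"Input '{inp.name}': shape={dims}, dtype={dtype}")

    if ttype.elem_type == TensorProto.STRING:
        print("  STRING tensors are not supported.")
    if None in dims:
        print("  Dynamic dimensions found; all tensor shapes must be static.")
    if not inp.name.startswith("image-"):
        print("  Consider naming the image input with an 'image-' prefix.")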
You can use the sclblonnx package to run automatic checks on your ONNX graph. Please always run the following before uploading.
First, clean up your graph (this also tries to optimise the graph wherever possible):
import sclblonnx as so
graph = so.clean(graph)
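In practice you will typically load the graph from the .onnx file you exported, clean it, and write the cleaned graph back to disk before uploading. A minimal sketch (the file names are placeholders):
import sclblonnx as so

graph = so.graph_from_file("model.onnx")       # load the exported ONNX graph
graph = so.clean(graph)                        # clean and optimise where possible
so.graph_to_file(graph, "model-cleaned.onnx")  # the file you upload to the platform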
The check function automatically checks whether the ONNX graph adheres to our requirements:
import sclblonnx as so
so.check(graph)
You should always test your graph locally:
import sclblonnx as so

# code to construct graph g with output node "sum" and input example e.

result = so.run(g,
    inputs=e,
    outputs=["sum"]
)
print(result)
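For reference, a complete self-contained sketch in the style of the sclblonnx examples: it constructs a small graph with a single Add node, fixed-size FLOAT inputs, and an output named sum, and then runs it locally on example data (the values are illustrative):
import numpy as np
import sclblonnx as so

# Construct a minimal graph computing sum = x1 + x2.
g = so.empty_graph()
g = so.add_node(g, so.node("Add", inputs=["x1", "x2"], outputs=["sum"]))
g = so.add_input(g, "x1", "FLOAT", [1])
g = so.add_input(g, "x2", "FLOAT", [1])
g = so.add_output(g, "sum", "FLOAT", [1])

# Run the graph locally on an example input.
e = {
    "x1": np.array([1.5], dtype=np.float32),
    "x2": np.array([2.5], dtype=np.float32),
}
result = so.run(g, inputs=e, outputs=["sum"])
print(result)  # expected: [array([4.], dtype=float32)]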
Here we provide a list of all the operators we support for ONNX to SPMF conversion:
Operator | Opset 11/13 |
Scatter | ✔️ |