ONNX Export and Inference¶
ONNX export and inference for trained Learners.
OnnxInferenceWrapper ¶
Run an exported ONNX model with onnxruntime. Same API as InferenceWrapper.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `path` | `str \| Path` | path to the exported `.onnx` model file | *required* |
| `session_options` | | additional keyword arguments forwarded to `onnxruntime.InferenceSession` | `{}` |
Source code in tsfast/inference/onnx.py
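A hedged usage sketch: the import path, the `(batch, seq_len, channels)` input layout, and `"model.onnx"` are assumptions for illustration, not confirmed by this page; the wrapper name and the forwarding of extra keyword arguments to `onnxruntime.InferenceSession` come from the table above.

```python
import numpy as np
from tsfast.inference.onnx import OnnxInferenceWrapper  # import path assumed

# Extra kwargs are forwarded to onnxruntime.InferenceSession per the table above.
wrapper = OnnxInferenceWrapper(
    "model.onnx",                              # placeholder path to an exported model
    providers=["CPUExecutionProvider"],        # example InferenceSession kwarg
)

x = np.random.randn(1, 100, 2).astype(np.float32)  # (batch, seq_len, channels) assumed
y = wrapper.inference(x)                           # numpy in, numpy out
```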
inference ¶
Run inference on a numpy input and return a numpy output. The output's ndim mirrors the input's ndim, so an unbatched (2-D) input yields an unbatched output.
Source code in tsfast/inference/onnx.py
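The ndim-mirroring convention can be sketched in plain numpy. This is a minimal illustration, not tsfast's actual implementation: `run_model` is a toy stand-in for the ONNX session, and the `(batch, seq, channels)` layout is an assumption.

```python
import numpy as np

def run_model(batch: np.ndarray) -> np.ndarray:
    """Toy stand-in for the ONNX session: (batch, seq, c_in) -> (batch, seq, 1)."""
    return batch.sum(axis=-1, keepdims=True)

def inference(x: np.ndarray) -> np.ndarray:
    # Accept a single sequence (2-D) or a batch (3-D); mirror ndim on the way out.
    unbatched = x.ndim == 2
    if unbatched:
        x = x[None]                      # add a batch axis: (seq, c) -> (1, seq, c)
    y = run_model(x)
    return y[0] if unbatched else y      # drop the batch axis again for 2-D input

x2 = np.ones((10, 3), dtype=np.float32)     # single sequence -> 2-D output
x3 = np.ones((4, 10, 3), dtype=np.float32)  # batch of 4 -> 3-D output
print(inference(x2).ndim, inference(x3).ndim)  # 2 3
```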
export_onnx ¶
Export a trained Learner's model to ONNX format with normalization baked in.