A new interface for running ML inference

Xiangyang Ju requested to merge xju/athena:onnx_infer into main

Yet another interface for running ML inference with OnnxRuntime (ORT) is introduced: Control/AthOnnx/AthOnnxInterfaces/AthOnnxInterfaces/IAthInferenceTool.h. The interface is designed so that the same code path can later serve the Inference-as-a-Service (IaaS) approach once a Triton client is available.
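
For illustration only, here is a minimal sketch of the shape such a tool interface could take, assuming Gaudi's IAlgTool base and a plain float-vector inference method; the actual names and signatures in IAthInferenceTool.h may well differ:

```cpp
// Hypothetical sketch -- not the actual contents of IAthInferenceTool.h.
// Assumes Gaudi's IAlgTool base class and a flat float-vector data exchange.
#include "GaudiKernel/IAlgTool.h"
#include "GaudiKernel/StatusCode.h"
#include <vector>

class IAthInferenceTool : virtual public IAlgTool {
public:
  DeclareInterfaceID(IAthInferenceTool, 1, 0);

  // Run one inference call: flattened input tensor in, flattened output out.
  // A concrete ORT implementation would wrap an Ort::Session; an IaaS
  // implementation could forward the same call to a Triton client instead.
  virtual StatusCode inference(const std::vector<float>& inputData,
                               std::vector<float>& outputData) const = 0;
};
```

Keeping the client code against an abstract tool interface like this is what allows the ORT and Triton back ends to be swapped via job configuration rather than code changes.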

An example is created for the ORT backend: Control/AthenaExamples/AthExOnnxRuntime/src/EvaluateModelWithAthInfer.cxx. The plan is to reuse the same example for IaaS. Along the way, the example code is refactored and cleaned up.
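
As a hedged sketch of how such an example algorithm might retrieve and use the tool through a ToolHandle (the property name, method name, and dummy input below are illustrative assumptions, not the code in this merge request):

```cpp
// Illustrative usage only -- the real EvaluateModelWithAthInfer.cxx may differ.
#include "AthenaBaseComps/AthReentrantAlgorithm.h"
#include "AthOnnxInterfaces/IAthInferenceTool.h"
#include "GaudiKernel/EventContext.h"
#include "GaudiKernel/ToolHandle.h"
#include <vector>

class EvaluateModelWithAthInfer : public AthReentrantAlgorithm {
public:
  using AthReentrantAlgorithm::AthReentrantAlgorithm;

  StatusCode initialize() override {
    ATH_CHECK(m_inferenceTool.retrieve());   // fail early if the tool is misconfigured
    return StatusCode::SUCCESS;
  }

  StatusCode execute(const EventContext& /*ctx*/) const override {
    std::vector<float> input(784, 0.0f);     // e.g. a flattened 28x28 image
    std::vector<float> output;
    ATH_CHECK(m_inferenceTool->inference(input, output));  // hypothetical method name
    ATH_MSG_DEBUG("Received " << output.size() << " output values");
    return StatusCode::SUCCESS;
  }

private:
  // The property name "InferenceTool" is an assumption for this sketch.
  ToolHandle<IAthInferenceTool> m_inferenceTool{this, "InferenceTool", "",
                                                "Tool implementing IAthInferenceTool"};
};
```

Because the algorithm only talks to the interface, pointing the ToolHandle at an ORT-backed or a Triton-backed implementation would be a pure configuration change.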

The namespace used by AthOnnxUtils is changed to AthOnnxUtils, making it clear which functions come from this package.
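
A brief illustration of the effect; the helper name below is a hypothetical placeholder, not an actual function from this merge request:

```cpp
// Sketch only: utilities now live under a namespace matching the package name,
// so call sites read AthOnnxUtils::<function> and their origin is unambiguous.
#include <cstdint>
#include <vector>

namespace AthOnnxUtils {
  // hypothetical helper, purely for illustrating the qualified call style
  void printTensorShape(const std::vector<int64_t>& shape);
}

// Call site:
//   AthOnnxUtils::printTensorShape(shape);
```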
