##############
Backend module
##############

This module provides a unified interface for running inference, together with implementations of different backend types. Each concrete backend wraps an execution provider or technology, so no special handling is required on your side: create a ``Backend`` instance and use it. Preconfigured backends (presets) are also available and can be useful for finding the optimal backend and its parameters.

*****************
Backend interface
*****************

Interface for running inference. All backends implemented in the **ENOT Lite** framework follow this basic interface.

.. autoclass:: enot_lite.backend.backend.Backend
   :members:

*********************
ORT Backend interface
*********************

Interface for backends based on ONNX Runtime.

.. autoclass:: enot_lite.backend.ort.OrtBackend
   :members:

***************
ORT CPU Backend
***************

.. autoclass:: enot_lite.backend.ort_cpu.OrtCpuBackend
   :members:

****************
ORT CUDA Backend
****************

.. autoclass:: enot_lite.backend.ort_cuda.OrtCudaBackend
   :members:

********************
ORT OpenVINO Backend
********************

.. autoclass:: enot_lite.backend.ort_openvino.OrtOpenvinoBackend
   :members:

**************************
ORT OpenVINO Float Backend
**************************

.. autoclass:: enot_lite.backend.ort_openvino.OrtOpenvinoFloatBackend
   :members:

********************
ORT TensorRT Backend
********************

.. autoclass:: enot_lite.backend.ort_tensorrt.OrtTensorrtBackend
   :members:

**************************
ORT TensorRT Float Backend
**************************

.. autoclass:: enot_lite.backend.ort_tensorrt.OrtTensorrtFloatBackend
   :members:

**********************************
ORT TensorRT Optimal Float Backend
**********************************

.. autoclass:: enot_lite.backend.ort_tensorrt.OrtTensorrtFloatOptimBackend
   :members:

**************************
ORT TensorRT Int-8 Backend
**************************

.. autoclass:: enot_lite.backend.ort_tensorrt.OrtTensorrtInt8Backend
   :members:
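The unified-interface idea described above (caller code stays the same no matter which concrete backend is behind it) can be sketched in a few lines. This is a hypothetical illustration only: the class names ``CpuBackend`` and ``GpuBackend``, and the assumption that a backend is called with a dict of named inputs, are stand-ins and not the actual ENOT Lite API.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict


class Backend(ABC):
    """Unified inference interface: every concrete backend is callable."""

    @abstractmethod
    def __call__(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        """Run inference on named inputs and return named outputs."""


class CpuBackend(Backend):
    """Toy stand-in for a CPU execution provider (not the real OrtCpuBackend)."""

    def __call__(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # Pretend to run a model: double every input value.
        return {name: value * 2 for name, value in inputs.items()}


class GpuBackend(Backend):
    """Toy stand-in for a GPU execution provider."""

    def __call__(self, inputs: Dict[str, Any]) -> Dict[str, Any]:
        # Same contract, different (pretend) execution technology.
        return {name: value * 2 for name, value in inputs.items()}


def run(backend: Backend, inputs: Dict[str, Any]) -> Dict[str, Any]:
    # Caller code is identical whichever backend instance is passed in.
    return backend(inputs)


print(run(CpuBackend(), {"x": 3}))  # {'x': 6}
```

Swapping ``CpuBackend()`` for ``GpuBackend()`` changes nothing in the calling code, which is exactly the property the real ``Backend`` hierarchy provides for ONNX Runtime, CUDA, OpenVINO, and TensorRT backends.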