Installation guide

Requirements

  • Python 3.6 or higher

  • ONNX

  • ONNX Runtime compiled with TensorRT and OpenVINO

Installation

1. Install the onnxruntime-gpu package

Install ONNX Runtime compiled with the TensorRT and OpenVINO execution providers. This package can be installed from the ENOT GitHub repository with the following command:

wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash

Note

Ignore messages about missing openvino-dev package dependencies.

There are two types of builds: GPU and CPU. If your target device has only a CPU, use the CPU build:

wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash -s -- -t CPU

Warning

When running in a Docker build stage, pass the -d option to skip the driver version check; otherwise the script fails with an "nvidia-smi: command not found" error:

wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash -s -- -d
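
After the installation script finishes, you can verify that the installed ONNX Runtime exposes the expected execution providers; a GPU build should list TensorrtExecutionProvider and OpenVINOExecutionProvider in the output:

python -c "import onnxruntime; print(onnxruntime.get_available_providers())"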

2. Install ENOT Lite framework

Install the ENOT Lite framework from PyPI:

pip install enot-lite
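
To check that the package was installed correctly, make sure it can be imported (assuming the Python module is named enot_lite, matching the PyPI package name):

python -c "import enot_lite"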

3. License setup

Create a hasp_26970.ini file in the $HOME/.hasplm folder with the following content:

broadcastsearch = 0
serveraddr = 212.20.47.88
disable_IPv6 = 0

This can be done with the following commands:

mkdir -p $HOME/.hasplm
echo -e 'broadcastsearch = 0\nserveraddr = 212.20.47.88\ndisable_IPv6 = 0' > $HOME/.hasplm/hasp_26970.ini
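
To confirm that the license configuration was written correctly, print the file contents:

cat $HOME/.hasplm/hasp_26970.ini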

Now everything is ready to use ENOT Lite.