Installation guide
Requirements
| arch / distrib | Python 3.6 | Python 3.7 | Python 3.8 | Python 3.9 |
|---|---|---|---|---|
| x86_64 | x | ✓ | ✓ | ✓ |
| Jetpack 4.5.1 | ✓ | ✓ | ✓ | x |
| Jetpack 4.6.0 | ✓ | ✓ | ✓ | x |
| Jetpack 4.6.1 | ✓ | ✓ | ✓ | x |
| Jetpack 4.6.2 | ✓ | ✓ | ✓ | x |
| Jetpack 5.0.2 | x | ✓ | ✓ | x |
Other requirements:
- ONNX
- ONNX Runtime compiled with TensorRT and OpenVINO
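Before installing, you can check that the active interpreter matches a supported Python version from the table above. A minimal check, assuming an x86_64 host, might look like this:

import sys

# Supported interpreter versions for x86_64 per the table above.
# Assumption: adjust the set for Jetpack targets, which support Python 3.6-3.8.
SUPPORTED = {(3, 7), (3, 8), (3, 9)}

if sys.version_info[:2] not in SUPPORTED:
    raise RuntimeError(f"Python {sys.version_info.major}.{sys.version_info.minor} is not supported on x86_64")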
Installation
1. Install onnxruntime-gpu package
Install ONNX Runtime compiled with the TensorRT and OpenVINO execution providers. The package can be installed from the ENOT GitHub repository using the following command:
wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash
Note
Ignore messages about missing openvino-dev package dependencies.
There are two types of builds: GPU and CPU. If your target device has only a CPU, use the CPU version:
wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash -s -- -t CPU
Warning
When installing inside a Docker build stage, pass the -d option to skip the driver version check; otherwise the installation fails with an "nvidia-smi: command not found" error.
wget -O - https://raw.githubusercontent.com/ENOT-AutoDL/ONNX-Runtime-with-TensorRT-and-OpenVINO/master/install.sh | bash -s -- -d
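To verify the ONNX Runtime build, you can list the execution providers it reports; depending on whether you installed the GPU or CPU build, TensorrtExecutionProvider and/or OpenVINOExecutionProvider should appear in the output:

import onnxruntime as ort

# Lists all execution providers compiled into the installed ONNX Runtime build.
print(ort.get_available_providers())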
2. Install ENOT Lite framework
Install the ENOT Lite framework from PyPI:
pip install enot-lite
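To confirm the installation, you can query the installed version; on Python 3.8+ this can be done with the standard library (on older interpreters, pip show enot-lite gives the same information):

from importlib.metadata import version

# Prints the installed enot-lite version, confirming the PyPI install succeeded.
print(version("enot-lite"))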
3. License setup
Create the hasp_26970.ini file in the $HOME/.hasplm folder with the following content:
broadcastsearch = 0
serveraddr = 65.109.162.71
disable_IPv6 = 0
This can be done by using the following commands:
mkdir -p $HOME/.hasplm
echo -e 'broadcastsearch = 0\nserveraddr = 65.109.162.71\ndisable_IPv6 = 0' > $HOME/.hasplm/hasp_26970.ini
Now everything is ready for using ENOT Lite.
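As a final smoke test, a short script like the one below can be used. It is only a sketch: it assumes the import name is enot_lite and checks that the pieces above are in place, not that a model actually runs.

import importlib
from pathlib import Path

# License configuration created in step 3.
license_file = Path.home() / ".hasplm" / "hasp_26970.ini"
if not license_file.exists():
    raise FileNotFoundError(f"missing license config: {license_file}")

# Packages installed in steps 1 and 2 (import name enot_lite is an assumption).
for module in ("onnxruntime", "enot_lite"):
    importlib.import_module(module)

print("ENOT Lite environment looks ready")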