There is a tutorial on this site covering Python API code for OpenVINO using a Raspberry Pi and a Neural Compute Stick, but to run it in a PC-only environment you need to make a few changes and additions, so I am writing them down here. Image source: Deep learning inference learned from scratch with OpenVINO™
The following three changes are required:
- Change the target device
- Change FP16 to FP32
- Add a library load
Just change the device from MYRIAD, which means the Neural Compute Stick, to CPU. This is easy.
# Before the change
plugin = IEPlugin(device="MYRIAD")
# After the change
plugin = IEPlugin(device="CPU")
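For context, here is a minimal sketch of how this plugin is used end to end with the pre-2020 `IEPlugin` API that this article targets. The file paths are placeholders, not files shipped with OpenVINO:

```python
def build_exec_net(xml_path, bin_path, device="CPU"):
    """Load an IR model (xml + bin) and return an executable network
    using the pre-2020 OpenVINO Python API."""
    # Imported inside the function so the sketch can be read (and the
    # module loaded) even on a machine without OpenVINO installed.
    from openvino.inference_engine import IENetwork, IEPlugin

    plugin = IEPlugin(device=device)                   # "CPU" instead of "MYRIAD"
    net = IENetwork(model=xml_path, weights=bin_path)  # trained model files
    return plugin.load(network=net)

# Usage (paths are placeholders):
# exec_net = build_exec_net("face-detection-retail-0004.xml",
#                           "face-detection-retail-0004.bin")
```

Note that newer OpenVINO releases replaced `IEPlugin` with `IECore`, so this sketch only applies to the 2019-era API used in this article.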
Change the trained model (xml and bin files) from FP16 to FP32.
For example, the FP32 version of face-detection-retail-0004
for OpenVINO 2019_R3.1 can be downloaded from the following site.
Intel Open Source Technology Center
By the way, it may work with FP16 as it is, but FP32 is recommended for the CPU, so it is better to change it. Image source: Intel OpenVINO™ TOOLKIT official website
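If you are unsure which precision a downloaded IR file declares, you can inspect the xml. This is a minimal sketch; the embedded XML below is a hypothetical fragment of the old (pre-v10) IR layout, where each layer carries a `precision` attribute, not a real model file:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of an old-format IR xml for illustration only.
sample_ir = """<net name="face-detection-retail-0004" version="6">
  <layers>
    <layer id="0" name="data" precision="FP32" type="Input"/>
  </layers>
</net>"""

def ir_precisions(xml_text):
    """Collect the precision attributes declared by the layers of an IR xml."""
    root = ET.fromstring(xml_text)
    return {layer.get("precision")
            for layer in root.iter("layer")
            if layer.get("precision")}

print(ir_precisions(sample_ir))  # -> {'FP32'}
```

For a real model, read the xml file from disk first (`ET.parse(path).getroot()`); newer IR versions encode precision differently, so treat this only as a quick sanity check for the 2019-era files discussed here.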
Depending on the trained model you use (face recognition, for example), you may see errors like the following:
Traceback (most recent call last):
File "XXXXXXXX.py", line XXX, in <module>
exec_net = plugin.load(network=net)
File "ie_api.pyx", line 547, in openvino.inference_engine.ie_api.IEPlugin.load
File "ie_api.pyx", line 557, in openvino.inference_engine.ie_api.IEPlugin.load
RuntimeError: Unsupported primitive of type: PriorBoxClustered name: fc7_mbox_priorbox
To deal with this, you need to load the CPU Extensions library. The path and file name differ depending on your OS and CPU, so try what matches your environment.
Windows:
# Add under the target device specification
plugin.add_cpu_extension("C:/Program Files (x86)/IntelSWTools/openvino/deployment_tools/inference_engine/bin/intel64/Release/cpu_extension_avx2.dll")
If your CPU does not support AVX2, specify the full path to cpu_extension.dll instead.

Linux:
# Add under the target device specification
plugin.add_cpu_extension('/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libcpu_extension_sse4.so')
If your CPU supports AVX2, libcpu_extension_avx2.so may work as well.

macOS:
# Add under the target device specification
plugin.add_cpu_extension('/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libcpu_extension.dylib')
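The per-OS paths above can be wrapped in a small helper that picks the right one automatically. This is a sketch using only the standard library; the table simply reuses the paths shown in this article, so adjust them for your own install location and CPU features:

```python
import platform

# Paths taken from the examples above; they are typical defaults,
# not guaranteed for every OpenVINO install.
CPU_EXTENSION_PATHS = {
    "Windows": "C:/Program Files (x86)/IntelSWTools/openvino/deployment_tools/inference_engine/bin/intel64/Release/cpu_extension_avx2.dll",
    "Linux": "/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libcpu_extension_sse4.so",
    "Darwin": "/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libcpu_extension.dylib",
}

def cpu_extension_path(system=None):
    """Return a plausible CPU-extension library path for the given OS
    (defaults to the current one, as reported by platform.system())."""
    system = system or platform.system()
    return CPU_EXTENSION_PATHS[system]

# Usage:
# plugin.add_cpu_extension(cpu_extension_path())
```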
With this, as long as you have a PC, you can run deep learning inference for free! See you.