OpenVINO

Inference Engine Samples and Open Model Zoo Demos

The Inference Engine sample applications are available in the following directories:
<INSTALL_DIR>/inference_engine/samples/c
<INSTALL_DIR>/inference_engine/samples/cpp
<INSTALL_DIR>/inference_engine/samples/python

The Open Model Zoo demos are available after installation in the following directory: <INSTALL_DIR>/deployment_tools/open_model_zoo/demos

The demos can also be obtained from the Open Model Zoo GitHub repository.

C++, C++ G-API, and Python* versions are located in the cpp, cpp_gapi, and python subdirectories, respectively.
The Inference Engine sample applications are listed here:
https://docs.openvino.ai/2021.4/openvino_docs_IE_DG_Samples_Overview.html#

To run the samples, first complete the "Preparing for Running" steps below, then follow the documentation.
The Open Model Zoo demos are listed here:
https://docs.openvino.ai/2021.4/omz_demos.html

To run the demos, first complete the "Preparing for Running" steps below, then follow the documentation.
Build the Sample Applications on Linux

To build the C or C++ sample applications for Linux, go to the sample applications directory and run the build_samples.sh script.

Once the build is completed, you can find sample binaries in the following folders:
C samples: ~/inference_engine_c_samples_build/intel64/Release
C++ samples: ~/inference_engine_cpp_samples_build/intel64/Release
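The flow above can be sketched as a dry run (INSTALL_DIR is the typical default and an assumption here; the commands are printed rather than executed, so the sketch is safe to try anywhere):

```shell
# Dry-run sketch of the Linux sample build flow; nothing is executed,
# the commands are only printed. INSTALL_DIR is an assumed default.
INSTALL_DIR=/opt/intel/openvino_2021
echo "cd $INSTALL_DIR/inference_engine/samples/cpp && ./build_samples.sh"
# After the build completes, the C++ sample binaries land here:
echo "ls $HOME/inference_engine_cpp_samples_build/intel64/Release"
```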

Build the Sample Applications on Microsoft Windows* OS

To build the C or C++ sample applications for Microsoft Windows* OS, go to the sample applications directory and run the build_samples_msvc.bat script.

The script automatically detects the highest installed Microsoft Visual Studio version.

Once the build is completed, you can find sample binaries in the following folders:
C samples: C:\Users\<user>\Documents\Intel\OpenVINO\inference_engine_c_samples_build\intel64\Release
C++ samples: C:\Users\<user>\Documents\Intel\OpenVINO\inference_engine_cpp_samples_build\intel64\Release
To build the demos, you need to source the Inference Engine and OpenCV environment from a binary package, which is available as a proprietary distribution.

Run the following command before building the demos (assuming the binary package was installed to <INSTALL_DIR>):
source <INSTALL_DIR>/deployment_tools/bin/setupvars.sh

You can also build the demos manually using an Inference Engine built from the openvino repo.

In this case, set the InferenceEngine_DIR environment variable to the build folder containing InferenceEngineConfig.cmake, and ngraph_DIR to the folder containing ngraphConfig.cmake.

Also set OpenCV_DIR to point to the OpenCV package to use. The same OpenCV version should be used for both the Inference Engine and demos builds.

Alternatively, these values can be provided via the command line when running cmake; see CMake's search procedure. Refer to the Inference Engine build instructions for details.

Also add the path to the built Inference Engine libraries to the LD_LIBRARY_PATH (Linux*) or PATH (Windows*) variable before building the demos.
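A minimal sketch of such a manual configure step, assuming a hypothetical Inference Engine build tree under $HOME/openvino/build (all directories below are placeholder examples, and the cmake command is printed rather than executed):

```shell
# Hypothetical build tree location; adjust to your checkout.
IE_BUILD=$HOME/openvino/build
# Make the built Inference Engine libraries visible to the loader.
export LD_LIBRARY_PATH="$IE_BUILD/bin/intel64/Release/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Print the configure command, passing the variables via -D instead of
# exporting them; OpenCV_DIR shown is a common but assumed location.
echo cmake \
  "-DInferenceEngine_DIR=$IE_BUILD" \
  "-Dngraph_DIR=$IE_BUILD/ngraph" \
  "-DOpenCV_DIR=/usr/local/lib/cmake/opencv4" \
  "<open_model_zoo>/demos"
```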

///
Build the Demo Applications on Linux

Go to the directory with the build_demos.sh script and run it.

Build the Demo Applications on Microsoft Windows* OS

Go to the directory with the build_demos_msvc.bat batch file and run it.

The script automatically detects the highest installed Microsoft Visual Studio version.
Build Specific Demos

follow the instructions for building the demo applications above, but add --target <demo1> <demo2> ... to the cmake --build command or --target="<demo1> <demo2> ..." to the build_demos* command.

Note that the cmake --build tool supports multiple targets starting with version 3.15; with lower versions you can specify only one target per invocation.
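The version cutoff can be handled with a small helper, sketched below (`ver` is a hardcoded example value; with a real CMake you would read it from `cmake --version`, and the build commands are printed rather than executed):

```shell
# Decide whether one `cmake --build` call may carry several --target values
# (supported from CMake 3.15). `ver` is a hardcoded example version.
ver=3.10.2
if [ "$(printf '%s\n' 3.15 "$ver" | sort -V | head -n1)" = "3.15" ]; then
  # 3.15 sorts first, so ver >= 3.15: one call covers all targets.
  echo "cmake --build . --target classification_demo segmentation_demo"
else
  # Older CMake: issue one call per target.
  for t in classification_demo segmentation_demo; do
    echo "cmake --build . --target $t"
  done
fi
```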

///

For Linux*:

cmake -DCMAKE_BUILD_TYPE=Release <open_model_zoo>/demos
cmake --build . --target classification_demo segmentation_demo

or

build_demos.sh --target="classification_demo segmentation_demo"

For Microsoft Windows* OS:

cmake -A x64 <open_model_zoo>/demos
cmake --build . --config Release --target classification_demo segmentation_demo

or

build_demos_msvc.bat --target="classification_demo segmentation_demo"

Preparing for Running the Sample Applications on Linux

Run the setupvars script to set all necessary environment variables:
source <INSTALL_DIR>/bin/setupvars.sh

OpenVINO environment variables are removed when you close the shell. To set them permanently:

Open the .bashrc file in <user_home_directory>:
vi <user_home_directory>/.bashrc

Add this line to the end of the file:
source /opt/intel/openvino_2021/bin/setupvars.sh

Save and close the file: press the Esc key, type :wq and press the Enter key.

To test your change, open a new terminal. You will see [setupvars.sh] OpenVINO environment initialized.
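The edit can also be made non-interactively; the snippet below appends the line only if it is not already present (shown against a temporary file so it is safe to try; point rc at ~/.bashrc for the real change):

```shell
# Append the setupvars line idempotently. The demo writes to a temp file,
# not your real ~/.bashrc.
rc=$(mktemp)
line='source /opt/intel/openvino_2021/bin/setupvars.sh'
grep -qxF "$line" "$rc" || printf '%s\n' "$line" >> "$rc"
# Running the same command twice still leaves exactly one copy.
grep -qxF "$line" "$rc" || printf '%s\n' "$line" >> "$rc"
grep -c "setupvars.sh" "$rc"
```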

You are ready to run sample applications.

Preparing for Running the Sample Applications on Windows

Run the setupvars script to set all necessary environment variables:
<INSTALL_DIR>\bin\setupvars.bat

You are ready to run sample applications.

///

To debug or run the samples on Windows in Microsoft Visual Studio, make sure you have properly configured the Debugging environment settings for the Debug and Release configurations: set correct paths to the OpenCV libraries and to the debug and release versions of the Inference Engine libraries. For example, for the Debug configuration, go to the project's Configuration Properties, select the Debugging category, and set the PATH variable in the Environment field to the following:

PATH=<INSTALL_DIR>\deployment_tools\inference_engine\bin\intel64\Debug;<INSTALL_DIR>\opencv\bin;%PATH%

where <INSTALL_DIR> is the directory in which the OpenVINO toolkit is installed.

You are ready to run the samples.
Preparing for Running the Demo Applications on Linux

Run the setupvars script to set all necessary environment variables:
source <INSTALL_DIR>/bin/setupvars.sh

OpenVINO environment variables are removed when you close the shell. To set them permanently:

Open the .bashrc file in <user_home_directory>:
vi <user_home_directory>/.bashrc

Add this line to the end of the file:
source <INSTALL_DIR>/bin/setupvars.sh

Save and close the file: press the Esc key, type :wq and press the Enter key.

To test your change, open a new terminal. You will see [setupvars.sh] OpenVINO environment initialized.

To run Python demo applications that require native Python extension modules, you must additionally set the PYTHONPATH environment variable as follows, where <bin_dir> is the directory with the built demo applications:
export PYTHONPATH="$PYTHONPATH:<bin_dir>/lib"
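As a self-contained illustration of the export (the directory below is a hypothetical stand-in for your real <bin_dir>):

```shell
# Hypothetical demo build directory, used only for illustration.
bin_dir=/tmp/omz_demo_build
mkdir -p "$bin_dir/lib"
export PYTHONPATH="$PYTHONPATH:$bin_dir/lib"
# Confirm the lib folder is now on the module search path variable.
case ":$PYTHONPATH:" in
  *":$bin_dir/lib:"*) echo "PYTHONPATH updated" ;;
esac
```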

You are ready to run demo applications.

Preparing for Running the Demo Applications on Windows

Run the setupvars script to set all necessary environment variables:
<INSTALL_DIR>\bin\setupvars.bat

To run Python demo applications that require native Python extension modules, you must additionally set up the PYTHONPATH environment variable as follows, where <bin_dir> is the directory with the built demo applications:

set PYTHONPATH=%PYTHONPATH%;<bin_dir>

You are ready to run demo applications.
///

To debug or run the demos on Windows in Microsoft Visual Studio, make sure you have properly configured the Debugging environment settings for the Debug and Release configurations: set correct paths to the OpenCV libraries and to the debug and release versions of the Inference Engine libraries. For example, for the Debug configuration, go to the project's Configuration Properties, select the Debugging category, and set the PATH variable in the Environment field to the following:

PATH=<INSTALL_DIR>\deployment_tools\inference_engine\bin\intel64\Debug;<INSTALL_DIR>\opencv\bin;%PATH%

where <INSTALL_DIR> is the directory in which the OpenVINO toolkit is installed.

You are ready to run the demo applications.

Windows vs. Linux Comparison

Setup
  • Windows: follow the Get Started guide for Windows (see References).
  • Linux (Ubuntu, CentOS & Yocto): follow the Get Started guide for Linux (see References).

Known Compatibility
  • Windows: Windows 10
  • Linux: Ubuntu 18.04.x long-term support (LTS), 64-bit; Ubuntu 20.04.0 long-term support (LTS), 64-bit; CentOS 7.6, 64-bit (for target only); Yocto Project v3.0, 64-bit (for target only and requires modifications)

For deployment scenarios on Red Hat* Enterprise Linux* 8.2 (64-bit), you can use the Intel® Distribution of OpenVINO™ toolkit run-time package, which includes the Inference Engine core libraries, nGraph, OpenCV, Python bindings, and the CPU and GPU plugins. The package is available as:
  • Downloadable archive
  • PyPi package
  • Docker image

Install Location
  • Windows: C:\Program Files (x86)\Intel\openvino_<version>, referred to as <INSTALL_DIR>
  • Linux (root or administrator): /opt/intel/openvino_<version>/
  • Linux (regular users): /home/<USER>/intel/openvino_<version>/

Model Tools Location
  • Windows: <INSTALL_DIR>\deployment_tools
  • Linux: /opt/intel/openvino_<version>/deployment_tools

Verification Demo Location (contains demos to verify OpenVINO's workability after installing; to check, run demo_security_barrier_camera.bat)
  • Windows: <INSTALL_DIR>\deployment_tools\demo
  • Linux: /opt/intel/openvino_<version>/deployment_tools/demo

Model Files (#1 and #3 contain the exact same files stored in different locations)
  • Windows:
    #1 – <INSTALL_DIR>\deployment_tools\open_model_zoo\models\intel
    #2 – <INSTALL_DIR>\deployment_tools\intel_models
    #3 – <INSTALL_DIR>\deployment_tools\open_model_zoo\models\public
  • Linux:
    #1 – /opt/intel/openvino_<version>/deployment_tools/open_model_zoo/models/intel
    #2 – /opt/intel/openvino_<version>/deployment_tools/intel_models
    #3 – /opt/intel/openvino_<version>/deployment_tools/open_model_zoo/models/public

Demos/Samples (both are the same kind; they contain *.c, *.cpp & *.py files)
  • Windows:
    <INSTALL_DIR>\deployment_tools\open_model_zoo\demos
    <INSTALL_DIR>\deployment_tools\inference_engine\demos <– shortcut to open_model_zoo's demos folder
    <INSTALL_DIR>\deployment_tools\inference_engine\samples
  • Linux:
    /opt/intel/openvino_<version>/deployment_tools/open_model_zoo/demos
    /opt/intel/openvino_<version>/deployment_tools/inference_engine/samples

C/C++ Demo/Sample *.exe (built from the Demos/Samples cpp directory)
  • Windows: C:\Users\<User>\Documents\Intel\OpenVINO\omz_demos_build\intel64\Release

Model Optimizer (mo.py)
  • Location: <INSTALL_DIR>\deployment_tools\model_optimizer
  • Usage: python3 mo.py --input_model INPUT_MODEL --output_dir

Initialize Environment
  • Windows:
    cd <INSTALL_DIR>\bin
    setupvars.bat
  • Linux:
    cd /opt/intel/openvino_<version>/bin
    ./setupvars.sh

  1. First, a model, such as Caffe's <INPUT_MODEL>.caffemodel or TensorFlow's <INFERENCE_GRAPH>.pb, is prepared in its original framework.
  2. Next, the model is converted with OpenVINO's Model Optimizer (the mo.py script) into Intermediate Representation (IR) files (*.xml & *.bin), a graph format used for inference operations.
    1. IR repository
  3. Once converted, the IR files can be run by the Inference Engine (the software libraries that run inference against the IR).
    • Prior to IR inference, the environment has to be initialized by running setupvars.bat (Windows) or setupvars.sh (Linux).
    • For C++, run the corresponding built sample binary from the Release folder.
    • For Python, use the *.py file to infer. For example:
      • python <path_to_sample>/classification_sample_async.py -m <path_to_model>/alexnet.xml -i <path_to_image>/car.bmp <path_to_image>/cat.jpg -d GPU
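The three steps can be put together as a dry run (model name and paths are placeholders; each command is printed rather than executed, so this sketch runs anywhere):

```shell
# End-to-end sketch: prepare -> convert -> infer. All paths and the model
# name are placeholders; echo prints each command instead of running it.
MODEL=alexnet
echo "python3 mo.py --input_model $MODEL.caffemodel --output_dir ir/"
echo "python3 classification_sample_async.py -m ir/$MODEL.xml -i car.bmp -d GPU"
```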

References

  1. “Get Started with OpenVINO™ Toolkit on Windows* – OpenVINO™ Toolkit.” OpenVINO, docs.openvinotoolkit.org/latest/openvino_docs_get_started_get_started_windows.html.
  2. “Get Started with OpenVINO™ Toolkit on Linux* – OpenVINO™ Toolkit.” OpenVINO, docs.openvinotoolkit.org/latest/openvino_docs_get_started_get_started_linux.html.
  3. https://docs.openvino.ai/2021.4/openvino_docs_IE_DG_Samples_Overview.html#
  4. https://docs.openvino.ai/2021.4/omz_demos.html
