Brainy Pi


In this blog post, we will explore an AI pipeline for person detection, recognition, and reidentification using OpenVINO on BrainyPi. OpenVINO (Open Visual Inference and Neural Network Optimization) is a toolkit by Intel that enables developers to deploy deep learning models efficiently across a range of hardware platforms. Let's implement the Cross Road Camera Demo on BrainyPi!
The ability to detect, recognize, and reidentify persons is a crucial component in many computer vision applications, such as surveillance systems, crowd analysis, and personalized marketing. By leveraging OpenVINO’s optimization techniques and the Brainy Pi platform, we can build a robust and efficient solution for person-related tasks.

Installing OpenVINO

To get started, we need to install OpenVINO and its dependencies on BrainyPi. Open a terminal and execute the following command:
sudo apt install openvino-toolkit libopencv-dev
This command will install OpenVINO and the necessary OpenCV development files on your BrainyPi system.
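Before moving on, it can help to confirm the toolkit actually landed where the later steps expect it. The sketch below is a hedged check, assuming the /opt/openvino install prefix that the setupvars.sh step in the next section uses; adjust the path if your package installs elsewhere:

```shell
# Check that the OpenVINO environment script exists before proceeding.
# The /opt/openvino prefix is an assumption matching the setupvars.sh path
# used in the next section.
SETUPVARS=/opt/openvino/setupvars.sh
if [ -f "$SETUPVARS" ]; then
    echo "OpenVINO found: $SETUPVARS"
else
    echo "OpenVINO not found -- re-run: sudo apt install openvino-toolkit libopencv-dev"
fi
```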

Compiling Demos

Once OpenVINO is installed, we can proceed to compile the demos provided by the Open Model Zoo. These demos serve as excellent starting points for understanding and exploring the capabilities of OpenVINO. Follow the steps below to compile the demos:
  1. Set up the OpenVINO environment by sourcing the setupvars.sh script:
    source /opt/openvino/setupvars.sh
  2. Clone the Open Model Zoo repository, which contains the demos, by executing the following command:
    git clone --recurse-submodules https://github.com/openvinotoolkit/open_model_zoo.git
    cd open_model_zoo/demos/
  3. Now, build the demos by running the build_demos.sh script:
    ./build_demos.sh
This process will compile the demo applications and make them ready for execution on BrainyPi.
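To confirm the build succeeded, you can list the compiled binaries. On BrainyPi's 64-bit ARM CPU they typically land under ~/omz_demos_build/aarch64/Release (the same path used to launch the demo in the next section); the architecture subdirectory may differ on other systems:

```shell
# List the compiled demo binaries; the aarch64 subdirectory is an assumption
# based on BrainyPi's 64-bit ARM CPU.
BUILD_DIR="$HOME/omz_demos_build/aarch64/Release"
if [ -d "$BUILD_DIR" ]; then
    ls "$BUILD_DIR" | grep crossroad
else
    echo "Build output not found at $BUILD_DIR -- check build_demos.sh output"
fi
```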

Running Cross Road Camera Demo

With the demos compiled, we can now download the required models and run the Cross Road Camera demo using OpenVINO. Follow these steps:
  1. Download the models required for the demo by executing the following command:
    omz_downloader --list ~/open_model_zoo/demos/crossroad_camera_demo/cpp/models.lst -o ~/models/ --precision FP16
  2. Download the test video that we will use for demonstration purposes:
    cd ~/
    wget https://raw.githubusercontent.com/intel-iot-devkit/sample-videos/master/people-detection.mp4
  3. Once the models and the test video are downloaded, you can run the Cross Road Camera demo using the following command:
    ~/omz_demos_build/aarch64/Release/crossroad_camera_demo -i ~/people-detection.mp4 -m ~/models/intel/person-vehicle-bike-detection-crossroad-0078/FP16/person-vehicle-bike-detection-crossroad-0078.xml -m_pa ~/models/intel/person-attributes-recognition-crossroad-0230/FP16/person-attributes-recognition-crossroad-0230.xml -m_reid ~/models/intel/person-reidentification-retail-0287/FP16/person-reidentification-retail-0287.xml
In the above command, we specify the input video, as well as the paths to the downloaded models for person detection, person attributes recognition, and person reidentification. Feel free to adjust these paths based on your specific setup.
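Because the command line above is long, one convenient (hypothetical) refactoring is to collect the paths into shell variables. The model names and directory layout below simply mirror the download steps earlier in this post:

```shell
# Hypothetical wrapper around the demo invocation; adjust paths to your setup.
MODELS="$HOME/models/intel"
PRECISION="FP16"
DEMO="$HOME/omz_demos_build/aarch64/Release/crossroad_camera_demo"
VIDEO="$HOME/people-detection.mp4"
DET="$MODELS/person-vehicle-bike-detection-crossroad-0078/$PRECISION/person-vehicle-bike-detection-crossroad-0078.xml"
ATTR="$MODELS/person-attributes-recognition-crossroad-0230/$PRECISION/person-attributes-recognition-crossroad-0230.xml"
REID="$MODELS/person-reidentification-retail-0287/$PRECISION/person-reidentification-retail-0287.xml"

# Print the full command; replace 'echo' with 'exec' to actually run it.
echo "$DEMO" -i "$VIDEO" -m "$DET" -m_pa "$ATTR" -m_reid "$REID"
```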

Reference

https://docs.openvino.ai/2022.3/omz_demos_crossroad_camera_demo_cpp.html

Conclusion

In this blog post, we have walked through the process of setting up OpenVINO on BrainyPi and using it to build an AI pipeline for person detection, recognition, and reidentification. By leveraging the power of OpenVINO and the BrainyPi platform, developers and entrepreneurs can integrate these capabilities into their own computer vision products.
The ability to accurately detect and recognize persons opens up numerous possibilities for applications in various domains, including security, retail analytics, and customer experience enhancement. With the step-by-step instructions provided in this blog, you now have a solid foundation to start implementing your own AI-powered computer vision solutions using OpenVINO and BrainyPi.