Contains the OpenVINO™ toolkit for hardware-accelerated deep learning inference in computer vision applications.
Harness the performance of Intel®-based accelerators for deep learning inference with the CPU, GPU, and VPU included in this kit.
The kit includes a field-ready mountable aluminum enclosure and camera. Wi-Fi and cellular module add-ons are available for purchase.
Access Intel computer vision accelerators and speed up code performance. Supports heterogeneous processing and asynchronous execution.
Unleash convolutional neural network (CNN)-based deep learning inference using a common API and 10 trained models.
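As an illustration of the common inference API with heterogeneous and asynchronous execution, here is a minimal sketch using the OpenVINO Inference Engine Python bindings. The model file names and device string are placeholders, and exact attribute names vary between toolkit releases.

```python
from openvino.inference_engine import IECore
import numpy as np

# Create the Inference Engine core object (discovers available device plugins).
ie = IECore()

# Read a model in OpenVINO IR format (file names here are placeholders).
net = ie.read_network(model="face-detection.xml", weights="face-detection.bin")

# Load the network onto a device; "HETERO:MYRIAD,CPU" splits layers between
# the VPU and the CPU (heterogeneous processing).
exec_net = ie.load_network(network=net, device_name="HETERO:MYRIAD,CPU")

# Prepare a dummy input matching the network's expected shape
# (older releases expose this as net.inputs rather than net.input_info).
input_name = next(iter(net.input_info))
shape = net.input_info[input_name].input_data.shape
dummy = np.zeros(shape, dtype=np.float32)

# Asynchronous execution: start a request, do other work, then wait for it.
exec_net.start_async(request_id=0, inputs={input_name: dummy})
if exec_net.requests[0].wait() == 0:  # 0 == StatusCode.OK
    result = exec_net.requests[0].output_blobs
```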
Reduce development time with a library of optimized OpenCV* and OpenVX* functions and 15+ samples. Develop once, deploy for current and future Intel-based devices.
Use the growing repository of OpenCL™ starting points in OpenCV* to add your own unique code.
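For example, OpenCV's transparent API can route supported operations through OpenCL-capable hardware simply by working with cv2.UMat. A rough sketch (the image paths are placeholders):

```python
import cv2

# Check whether an OpenCL runtime is visible to OpenCV and enable it.
print("OpenCL available:", cv2.ocl.haveOpenCL())
cv2.ocl.setUseOpenCL(True)

# Wrapping a Mat in UMat lets OpenCV dispatch supported operations to its
# OpenCL kernels (the "transparent API").
frame = cv2.imread("input.jpg")          # placeholder path
u_frame = cv2.UMat(frame)

u_gray = cv2.cvtColor(u_frame, cv2.COLOR_BGR2GRAY)
u_edges = cv2.Canny(u_gray, 50, 150)

# Download the result back to host memory when CPU-side access is needed.
edges = u_edges.get()
cv2.imwrite("edges.jpg", edges)          # placeholder path
```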