With the Mustang-MPCIE-MX2 card, ICP Deutschland expands its portfolio of AI accelerator cards with a Mini PCIe plug-in card variant. The Mini PCIe format enables system integrators to build small embedded PC systems with AI functionality for deep learning inference. The AI functionality is provided by two Intel® Movidius™ Myriad™ X MA2485 Vision Processing Units (VPUs). With a power consumption of only 5 watts, the Mustang-MPCIE-MX2 is particularly suitable for low-power AI applications without sacrificing performance. Its multi-channel capability allows each VPU to be assigned a different deep learning topology, so that different workloads such as object recognition and face recognition can run simultaneously. Thanks to compatibility with Intel's Open Visual Inference & Neural Network Optimization (OpenVINO™) Toolkit, various AI training models can be integrated directly at the edge. The OpenVINO™ Toolkit optimizes the performance of the trained model and scales it to the target system. This fast, optimized integration lowers development costs for developers and customers alike.
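As an illustration of the multi-channel capability described above, the following minimal sketch shows how two different networks might be assigned to the two VPUs through the OpenVINO™ Python API. The model file names are hypothetical placeholders, and the exact MYRIAD device identifiers depend on the OpenVINO™ release in use and on how the driver enumerates the card on the target system.

```python
# Hedged sketch: assigning a different network to each Myriad X VPU via the
# OpenVINO Python API ("openvino.runtime" interface of a MYRIAD-enabled release).
# Model file names and device identifiers are placeholders, not product defaults.
from openvino.runtime import Core

core = Core()

# List the Myriad devices the runtime can see (expected: two entries for the
# Mustang-MPCIE-MX2, one per MA2485 VPU).
myriad_devices = [d for d in core.available_devices if d.startswith("MYRIAD")]
print("Detected VPUs:", myriad_devices)

# Hypothetical IR files produced beforehand with the OpenVINO Model Optimizer.
object_model = core.read_model("object-detection.xml")
face_model = core.read_model("face-detection.xml")

# Compile each topology for a different VPU so both run independently.
object_net = core.compile_model(object_model, myriad_devices[0])
face_net = core.compile_model(face_model, myriad_devices[1])
```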
The Mustang-MPCIE-MX2 is compatible with popular operating systems such as Ubuntu, CentOS, and Windows® 10. Numerous neural network architectures and topologies are supported, including AlexNet, GoogLeNet, SqueezeNet, and YOLO. In addition to further AI accelerator cards, ICP also offers embedded systems that are a perfect match for the new Mustang-MPCIE-MX2.
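Continuing the sketch above, a single synchronous inference on one of the compiled networks might look as follows; the input shape is a placeholder that depends on the chosen topology.

```python
import numpy as np

# Placeholder input tensor; the real shape and layout depend on the topology
# (e.g. 1x3x227x227 for AlexNet, 1x3x224x224 for GoogLeNet).
dummy_frame = np.zeros((1, 3, 224, 224), dtype=np.float32)

# Synchronous inference on the VPU that holds the object-detection network.
request = object_net.create_infer_request()
results = request.infer({object_net.input(0): dummy_frame})
```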
Specifications
- AI accelerator card in Mini PCIe form factor
- Two Intel® Movidius™ Myriad™ X MA2485 VPUs
- Low power consumption of 5 watts
- Operating temperature 0~50 °C
- Powered by the Intel® OpenVINO™ Toolkit
Applications
- Acceleration of deep learning inference systems
- Object recognition
- Face recognition
- AIoT edge computers