Major Announcement 30 Aug 2018

COMING SOON: This site will migrate to the Intel® Developer Zone.

What is the Intel® Movidius™ Neural Compute Stick?

The Intel® Movidius™ Neural Compute Stick (NCS) is a tiny, fanless deep learning device that you can use to learn AI programming at the edge. The NCS is powered by the same low-power, high-performance Intel® Movidius™ Vision Processing Unit (VPU) found in millions of smart security cameras, gesture-controlled drones, industrial machine vision equipment, and more.

Develop for High Performance, Low Power Devices

Learn What You Can Do With The NCS

The Intel Movidius Neural Compute Stick enables rapid prototyping, validation, and deployment of Deep Neural Network (DNN) inference applications at the edge. Its low-power VPU architecture enables an entirely new segment of AI applications that do not rely on a connection to the cloud.

The NCS, combined with the Intel® Movidius™ Neural Compute SDK (NCSDK), allows deep learning developers to profile, tune, and deploy Convolutional Neural Networks (CNNs) in low-power applications that require real-time inferencing.
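As a sketch of that profile/tune/deploy workflow, the NCSDK ships a command-line tool for each step. The Caffe file names below are placeholders; the commands assume the NCSDK is installed and an NCS is plugged in:

```shell
# Profile the network: per-layer timing and bandwidth estimates on the stick
mvNCProfile deploy.prototxt -w weights.caffemodel -s 12

# Validate: compare inference results on the NCS against the host framework
mvNCCheck deploy.prototxt -w weights.caffemodel -s 12

# Compile the trained model into a binary graph file for deployment
mvNCCompile deploy.prototxt -w weights.caffemodel -s 12 -o graph
```

The -s flag sets how many of the VPU's SHAVE vector cores the graph may use; the compiled graph file is what an application loads onto the stick at runtime.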

See What's Inside! Introducing the Intel® Movidius™ Myriad™ 2 VPU

The Intel® Movidius™ Myriad™ 2 VPU is the industry's first always-on vision processor. It delivers high-performance machine vision and visual awareness in severely power-constrained environments.

Technical Specifications

Processor: Intel® Movidius™ VPU
Supported frameworks: TensorFlow™, Caffe
Connectivity: USB 3.0 Type-A
Form factor: USB stick (72.5 mm x 27 mm x 14 mm)
Operating temperature: 0° - 40° C

Minimum system requirements:
x86_64 computer running Ubuntu* 16.04, or
Raspberry Pi 3 Model B running Stretch desktop, or
Ubuntu* 16.04 VirtualBox instance
USB 2.0 Type-A port (USB 3.0 recommended)
4GB free storage space

Compatible Software

Software Development Kits:

Intel® OpenVINO™ toolkit: The OpenVINO™ (Open Visual Inference & Neural network Optimization) toolkit is designed to fast-track development of high-performance computer vision solutions and to deliver fast, efficient deep learning workloads across Intel® silicon platforms.
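As a rough sketch of a typical OpenVINO flow: the Model Optimizer converts a trained model into the toolkit's Intermediate Representation (IR), and the Inference Engine then runs it on a chosen device. The file names below are placeholders, and the commands assume an OpenVINO installation with its bundled Python samples:

```shell
# Convert a trained TensorFlow model to OpenVINO IR (.xml topology + .bin weights)
python3 mo.py --input_model frozen_model.pb --output_dir ir/

# Run a bundled classification sample on the NCS by selecting the MYRIAD plugin
python3 classification_sample.py -m ir/frozen_model.xml -i image.jpg -d MYRIAD
```

Selecting -d MYRIAD targets the Myriad VPU on the Neural Compute Stick; the same IR can be run on other Intel silicon by changing the device flag (e.g. CPU or GPU).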

Android NN API: The Android Neural Networks API is a C API, introduced in Android 8.1, that runs the computation-intensive operations required by machine learning frameworks (e.g. TensorFlow Lite and Caffe) that build and train neural networks on Android platforms.

Discover More

Where to Buy

Order your Intel Movidius Neural Compute Stick today from one of our online partners.

More Info

Get Started

Learn how to profile, tune, compile, and deploy your neural networks at the edge.

More Info


Community

Collaborate with fellow developers and get troubleshooting help.

More Info