ncnn brings neural network inference acceleration using Vulkan

ncnn is a widely used open-source neural network inference framework optimized for mobile platforms. It supports commonly used CNN models such as ResNet, MobileNet, SSD, and YOLO, converted from Caffe, MXNet, ONNX, and other formats. The ncnn community used to focus on high-performance computing optimization on mobile CPUs using ARM NEON technology. The latest ncnn version adds GPU acceleration via Vulkan compute, which brings a cross-platform, vendor-independent neural network inference solution from desktop to mobile. The ncnn Vulkan acceleration runs natively on Windows, Linux, and Android, and on macOS and iOS via the MoltenVK project.
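
For readers who want to try the Vulkan path, here is a minimal C++ sketch of how it is typically enabled: set `use_vulkan_compute` on the network options before loading the model, and ncnn dispatches supported layers to the GPU. This assumes an ncnn build with Vulkan support compiled in; the model file names and the blob names "data" and "prob" are placeholders for illustration, not part of the announcement above.

```cpp
#include "net.h"  // ncnn's main header

int main()
{
    ncnn::Net net;

    // Ask ncnn to run supported layers on the GPU via Vulkan compute.
    // This must be set before loading the model; if no capable GPU is
    // available, inference falls back to the CPU path.
    net.opt.use_vulkan_compute = true;

    // Hypothetical model files converted beforehand (e.g. from Caffe/ONNX).
    net.load_param("mobilenet.param");
    net.load_model("mobilenet.bin");

    // Placeholder 224x224 RGB input filled with zeros.
    ncnn::Mat in(224, 224, 3);
    in.fill(0.f);

    // Run inference; blob names depend on the converted model.
    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);

    ncnn::Mat out;
    ex.extract("prob", out);

    return 0;
}
```

The rest of the application code stays the same whether the CPU or Vulkan backend is used, which is what makes the single-flag opt-in convenient across desktop and mobile targets.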