Faster Mobile GPU Inference with OpenCL

The TensorFlow team announces the official launch of its OpenCL-based mobile GPU inference engine for Android. It offers up to a ~2x speedup over the existing OpenGL backend on reasonably sized neural networks that have enough workload to keep the GPU busy.
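Apps reach the new backend through the existing TensorFlow Lite GPU delegate, so no code changes should be needed beyond enabling GPU acceleration. As a minimal sketch (assuming the standard TFLite Java API on Android, with `modelBuffer`, `input`, and `output` as placeholder app-side objects):

```java
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;

// Create the GPU delegate; at runtime it picks the best available
// backend on the device (OpenCL when present, otherwise OpenGL).
GpuDelegate gpuDelegate = new GpuDelegate();

// Attach the delegate to the interpreter options.
Interpreter.Options options = new Interpreter.Options().addDelegate(gpuDelegate);

// modelBuffer is the app's .tflite model loaded as a MappedByteBuffer.
Interpreter interpreter = new Interpreter(modelBuffer, options);

// Run inference as usual; supported ops execute on the GPU.
interpreter.run(input, output);

// Release native resources when done.
interpreter.close();
gpuDelegate.close();
```

Because delegate selection happens at runtime, the same application binary benefits automatically on devices where OpenCL is available, and falls back gracefully where it is not.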
