Fast Gaussian blur in real time on GPU

Gaussian filtering is a widely used standard algorithm and a must-have in many applications, from sharpening (unsharp mask) to SIFT/SURF. The Gaussian filter is isotropic and separable, and these properties are very important for fast and efficient image processing. Gaussian filtering is usually a time-consuming task, so it is a good idea to accelerate it on the GPU.
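Separability is the key to speed here: a W×W 2D convolution costs W² multiply-adds per pixel, while two 1D passes cost only 2W. Below is a minimal CUDA sketch of the two-pass scheme for a single-channel float image; the kernel names, the constant-memory weight table, and the fixed radius are illustrative assumptions, not the actual library implementation.

    // Sketch of two-pass separable Gaussian filtering in CUDA for a
    // single-channel float image (illustrative, not the vendor's code).
    #include <cuda_runtime.h>
    #include <math.h>

    #define RADIUS 2                              // 5-tap window, matching sigma ~ 1
    __constant__ float d_weights[2 * RADIUS + 1]; // normalized 1D Gaussian taps

    // Horizontal pass: convolve each row with the 1D Gaussian kernel.
    __global__ void blurRows(const float *src, float *dst, int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;

        float sum = 0.0f;
        for (int k = -RADIUS; k <= RADIUS; ++k) {
            int xs = min(max(x + k, 0), width - 1);   // clamp at image borders
            sum += d_weights[k + RADIUS] * src[y * width + xs];
        }
        dst[y * width + x] = sum;
    }

    // Vertical pass: the same 1D kernel applied along columns of the
    // row-blurred intermediate image.
    __global__ void blurCols(const float *src, float *dst, int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;

        float sum = 0.0f;
        for (int k = -RADIUS; k <= RADIUS; ++k) {
            int ys = min(max(y + k, 0), height - 1);  // clamp at image borders
            sum += d_weights[k + RADIUS] * src[ys * width + x];
        }
        dst[y * width + x] = sum;
    }

    // Host side: build normalized 1D Gaussian weights and upload them once.
    void uploadGaussianWeights(float sigma)
    {
        float h[2 * RADIUS + 1], norm = 0.0f;
        for (int k = -RADIUS; k <= RADIUS; ++k) {
            h[k + RADIUS] = expf(-(float)(k * k) / (2.0f * sigma * sigma));
            norm += h[k + RADIUS];
        }
        for (int i = 0; i < 2 * RADIUS + 1; ++i) h[i] /= norm;
        cudaMemcpyToSymbol(d_weights, h, sizeof(h));
    }

The two kernels would be launched back to back (blurRows into a temporary buffer, then blurCols into the destination). For the 5×5 window this means 5 + 5 taps per pixel instead of 25 for a direct 2D convolution.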

Standard features for Gaussian filtering on NVIDIA GPUs

  • Data input: 8/16 or 24/48-bit images in CPU or GPU memory
  • Data output: final image in CPU or GPU memory
  • Parameters: sigma (blur radius); the window size follows from sigma, as sketched after this list
  • Optimized for the latest NVIDIA GPUs
  • Compatible with Windows 7/8/10 and Linux (Ubuntu/CentOS)
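
The benchmark below pairs sigma ~1 with a 5×5 window, which suggests the window radius is derived from sigma. A minimal sketch under the common truncation rule radius = ceil(2·sigma) (an assumption; the library's exact rule is not stated):

    // Hypothetical helper: derive the filter window size from sigma using
    // the common truncation rule radius = ceil(2 * sigma). For sigma = 1
    // this gives a 5x5 window, matching the benchmark below; the actual
    // rule used by the library may differ.
    #include <math.h>

    int gaussianWindowSize(float sigma)
    {
        int radius = (int)ceilf(2.0f * sigma); // covers ~95% of the Gaussian mass
        return 2 * radius + 1;                 // full window width, e.g. 5 for sigma = 1
    }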

Benchmarks for Gaussian blur on GeForce GTX 980 (Windows 10, CUDA 10, 64-bit)

Now we need just ~8 ms for a Gaussian blur (sigma ~1, 5×5 window) of a 24-bit color image at 3840×2160 resolution. Here are the benchmarks for 2K/4K, 24-bit images (GPU computation only, without device I/O latency):

  • Full HD (2K, 1920×1080) ~ 2.4 GByte/s
  • 4K (3840×2160) ~ 3 GByte/s
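
A quick sanity check on these numbers, using the frame size and timing quoted above:

    3840 × 2160 × 3 bytes ≈ 24.9 MB per frame
    24.9 MB / 8 ms ≈ 3.1 GByte/s

which is consistent with the ~3 GByte/s figure for 4K.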
