#CodingTime – Image denoising using CUDA

Hi all,

today I’m going to talk about an academic project I’ve developed: a parallel image-denoising algorithm for systems equipped with an NVIDIA GPU.

If you’re not sure what an image affected by noise looks like, take a look here. The aim of this project is to reduce the noise in the image. The kind of noise we want to reduce is called Gaussian noise, which is statistical noise with a probability density function (PDF) equal to that of the normal distribution. Pictures are very likely to be affected by Gaussian noise when there isn’t enough illumination during the acquisition of the photo.
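For reference, the PDF mentioned above is the familiar bell curve:

```latex
p(z) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(z-\mu)^2}{2\sigma^2}}
```

where \(\mu\) is the mean of the noise (typically 0) and \(\sigma\) controls how strong the noise is: the larger \(\sigma\), the grainier the picture.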

There are several algorithms that can reduce the noise in images. The easiest way is to perform a convolution in the spatial domain with a Gaussian kernel. This technique does what you need, but it can be too slow in some cases. To do it faster, we can work in the frequency domain instead, thanks to the convolution theorem. I used the cuFFT library to perform the transformations, CUDA runtime routines to move data between host memory and device (GPU) memory, and a simple component-wise matrix multiplication kernel for the filtering.
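To give you an idea of what the pipeline looks like, here is a minimal CUDA sketch of the frequency-domain approach: forward FFT, component-wise multiplication with the filter’s frequency response, inverse FFT. Names like `d_image`, `d_freq` and `d_filter` are illustrative, not necessarily the ones used in the repository, and error checking is omitted for brevity:

```cuda
#include <cuda_runtime.h>
#include <cufft.h>

// Multiply each frequency coefficient by the filter's response at that
// frequency, folding in the 1/(W*H) normalization that cuFFT's inverse
// transform does not apply.
__global__ void pointwiseFilter(cufftComplex *freq, const float *filter,
                                int n, float scale)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        freq[i].x *= filter[i] * scale;
        freq[i].y *= filter[i] * scale;
    }
}

// d_image:  real image data on the device (H x W)
// d_freq:   scratch buffer of H x (W/2 + 1) cufftComplex values
// d_filter: the Gaussian filter's frequency response, same layout as d_freq
void denoise(float *d_image, cufftComplex *d_freq, const float *d_filter,
             int width, int height)
{
    cufftHandle fwd, inv;
    cufftPlan2d(&fwd, height, width, CUFFT_R2C);  // nx = rows, ny = cols
    cufftPlan2d(&inv, height, width, CUFFT_C2R);

    cufftExecR2C(fwd, d_image, d_freq);           // spatial -> frequency

    int n = height * (width / 2 + 1);             // R2C keeps half the spectrum
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    pointwiseFilter<<<blocks, threads>>>(d_freq, d_filter, n,
                                         1.0f / (width * height));

    cufftExecC2R(inv, d_freq, d_image);           // frequency -> spatial

    cufftDestroy(fwd);
    cufftDestroy(inv);
}
```

Note that cuFFT’s real-to-complex transform only stores half of the spectrum (the other half is its complex conjugate), which is why the buffers are H × (W/2 + 1) rather than H × W.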

You can find the code on my GitHub profile; feel free to contribute in any way you like.
