# Huygens deconvolution

*The job of image restoration is to figure out what the instrument is actually trying to tell you*. — Prof. E.R. Pike, King's College, London.

## Introduction

**Deconvolution** is a mathematical operation used in Image Restoration to recover an image that is degraded by a physical process which can be described by the opposite operation, a convolution. This is the case in image formation by optical systems as used in microscopy and astronomy, but also in seismic measurements.

In microscopy this convolution process mathematically explains the formation of an image that is degraded by blurring and noise. The blurring is largely due to diffraction-limited imaging by the instrument. The noise is usually photon noise, a term that refers to the inherent natural variation of the incident photon flux.

The degree of spreading (blurring) of a single point-like (Sub Resolution) object is a measure of the quality of an optical system. The 3D blurred image of such a single point light source is usually called the **Point Spread Function** (PSF).

## Image formation

PSFs play an important role in the image formation theory of the fluorescent microscope. The reason for this is that in incoherent imaging systems such as fluorescent microscopes the image formation process is linear and described by Linear System theory. This means that when two objects A and B are imaged simultaneously, the result is equal to the sum of the independently imaged objects. In other words: the imaging of A is unaffected by the imaging of B and vice versa.

As a result of the linearity property, the image of any object can be computed by chopping up the object into parts, imaging each of these, and subsequently summing the results. When one chops up the object into extremely small parts, i.e. point objects of varying intensity, the image is computed as a sum of PSFs, each shifted to the location and scaled according to the intensity of the corresponding point. In conclusion: the imaging in the fluorescent microscope is completely described by its PSF.

For more details read Image Formation.
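This superposition can be checked with a toy computation: imaging two objects together gives the same result as imaging each separately and summing. The sketch below is purely illustrative (a 1D grid with an arbitrary three-point PSF, assuming numpy; none of these names come from Huygens):

```python
import numpy as np

def blur(obj, psf):
    """Image formation as a sum of PSFs: every point of the object
    contributes a shifted copy of the PSF, scaled by its intensity."""
    out = np.zeros_like(obj)
    for i, intensity in enumerate(obj):
        out += intensity * np.roll(psf, i)  # PSF shifted to position i
    return out

# Two point-like objects A and B on a 1D grid, and a small PSF.
a = np.zeros(32); a[8] = 1.0
b = np.zeros(32); b[20] = 2.0
psf = np.zeros(32); psf[[31, 0, 1]] = [0.25, 0.5, 0.25]

# Imaging A and B together equals imaging them independently and summing.
assert np.allclose(blur(a + b, psf), blur(a, psf) + blur(b, psf))
```

Because the PSF here sums to one, the blurred image also conserves the total intensity of each object.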

### Convolution

You can imagine that the image is formed in your microscope by *replacing* every original Sub Resolution light source by its 3D PSF (multiplied by the corresponding intensity). Looking only at one XZ slice of the 3D image, the result is formed like this:

`(Fig. 1)`

This process is mathematically described by a convolution equation of the form

{$ g = f * h $} (Eq. 1)

where the image *g* arises from the convolution of the real light sources *f* (the object) and the PSF *h*. The convolution operator * implies an integral over all space:

{$ g(\vec x) = \int f(\vec x')\, h(\vec x - \vec x')\, d\vec x' $} (Eq. 2)

### Interpretation

You can interpret equation 2 as follows: the recorded intensity in a VoXel located at {$ \vec x $} of the image {$ g(\vec x) $} arises from the contributions of all points of the object *f*, their real intensities weighted by the PSF *h* depending on the distance to the considered point.

### Calculation

That means that for each VoXel located at {$ \vec x $} the overlap between the object function *f* and the (shifted) PSF *h* must be calculated. Computing this overlap involves computing and summing the value of {$ f(\vec x')\ h(\vec x - \vec x') $} over the entire image. Having N voxels in the whole image, the computational cost is of the order of N².

But this can be improved. An important theorem of Fourier theory, the **convolution theorem**, states that the Fourier Transforms *G*, *F*, *H* of *g*, *f*, *h* respectively are related by a simple multiplication:

{$ G = F \cdot H $} (Eq. 3)

This means that a convolution can be computed by the following steps:

- Compute the Fourier transforms *F* and *H* of *f* and *h*.
- Multiply *F* by *H* to obtain *G*.
- Transform *G* back to *g*, the convolved image.

Because Fourier transforms require a number of operations in the order of N log(N), this is more efficient than the previous integral.
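These three steps translate almost directly into code. Below is a minimal numpy sketch (our own naming, assuming a PSF stored centered in its array and periodic boundary conditions):

```python
import numpy as np

def convolve_fft(f, h):
    """Convolve object f with PSF h via the convolution theorem."""
    F = np.fft.fftn(f)                    # step 1: transform the object
    H = np.fft.fftn(np.fft.ifftshift(h))  # step 1: transform the centered PSF
    G = F * H                             # step 2: multiply, G = F x H (Eq. 3)
    return np.real(np.fft.ifftn(G))       # step 3: transform G back to g

# A point source imaged through a small PSF returns the PSF itself.
f = np.zeros((8, 8)); f[4, 4] = 1.0
h = np.zeros((8, 8)); h[4, 4] = 0.5; h[3, 4] = 0.25; h[4, 5] = 0.25
g = convolve_fft(f, h)
```

The `ifftshift` call moves the PSF center to the array origin, the convention expected by the discrete Fourier transform.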

To see how the application of different PSFs affects the imaging of an object, read the Cookie Cutter.

### Deconvolution

If convolution implies replacing every original (sub-resolution) light source by its corresponding PSF to produce a blurred image, the restoration procedure goes the opposite way: collecting all this spread light and *putting it back* at its original location. That produces a better representation of the real object, clearer to our eyes. (This increases the Dynamic Range of the image, and makes the background regions look darker!)

Mathematically speaking, deconvolution is just solving the abovementioned Eq. 1, where you know the convolved image *g* and the PSF *h*, to obtain the original light distribution *f*: a representation of the "real" object.

The relation in Eq. 3 would seem to imply that it is possible to obtain the object function *F* by Inverse Filtering, simply by dividing: *F* = *G* / *H*. But due to the band-limited character of *H* it has zeros outside a certain region (see Cookie Cutter), resulting in a division by zero for many spatial frequencies. Moreover, in a real case the Photon Noise must be taken into account, so the equation that we actually have to solve is not Eq. 1, but this:

{$ k = f * h + \epsilon $} (Eq. 4)

`(Fig. 2)`

where the acquired image *k* arises from the convolution of the real light sources *f* and the PSF *h*, plus the photon noise *ε*. The division of *ε* by *H* would lead to extreme noise amplification, and therefore tremendous artifacts, due to the large areas of small values of *H* within the passband. (Also, we cannot simply subtract *ε* from *k*, because we cannot know the *exact* noise distribution.)

Thus, inverse filtering will never allow us to recover the true object function *f*. Instead, we must try to find an estimate *f*' which satisfies a sensible criterion and is stable in the presence of noise.

Some deconvolution methods (like Blind Deconvolution) try to solve equation 4 without knowing the PSF term *h*. Although some constraints can be applied, this is always risky, as it introduces a lot of indetermination into the solution of the equation. (How many solutions *x*, *y* can you find for an algebraic equation of the form *x* × *y* = 5?) Moreover, these methods currently lack scientific validation when applied to microscopy.

We must go for another solution.

## The way Huygens works

The Huygens Software of Scientific Volume Imaging enables you to obtain a PSF in two ways:

- automatically computing a Theoretical Psf based on known Microscopic Parameters and a model of the microscope, or
- distilling an Experimental Psf from spherical bead images, after Recording Beads.

In the second case, given a model of the bead shape, the PSF is 'distilled' such that its convolution with the bead model matches the measured bead image. That can be understood by looking back at figure 1 and equation 1: now we know what the object *f* is (the exact size of the spherical bead must be known) and we have acquired its image *g*, so we can distill the remaining unknown term *h* in the equation.

Once a PSF is provided, Huygens can use different mathematical algorithms to effectively solve the convolution equation 4 and do deconvolution:

- Classic Maximum Likelihood Estimation
- Quick Maximum Likelihood Estimation
- Iterative Constrained Tikhonov-Miller
- Quick Tikhonov-Miller
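As an aside, the bead-based PSF 'distillation' described above (solving Eq. 1 for *h* when the bead model *f* and its image *g* are both known) can be sketched as a regularized division in the Fourier domain. This is a hypothetical Wiener-style illustration in numpy, not SVI's actual distillation algorithm; the function name and `reg` parameter are ours:

```python
import numpy as np

def distill_psf(g, f, reg=1e-6):
    """Estimate the PSF h from a measured bead image g and a known,
    centered bead model f (Wiener-style sketch, periodic boundaries)."""
    G = np.fft.fftn(np.fft.ifftshift(g))
    F = np.fft.fftn(np.fft.ifftshift(f))
    # Regularized division: where |F| is large this is simply G / F.
    H = G * np.conj(F) / (np.abs(F) ** 2 + reg)
    h = np.fft.fftshift(np.real(np.fft.ifftn(H)))
    return h / h.sum()  # normalize the PSF to unit volume
```

The `reg` term prevents division by the near-zero values of *F*, the same zero-division problem that rules out naive inverse filtering of images.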

The Classic Maximum Likelihood Estimation (CMLE) is the most general Restoration Method available, valid for almost any kind of image. It is based on the idea of iteratively optimizing the likelihood of an estimate of the object, given the measured image and the PSF. The object estimate is in the form of a regular 3D image. The likelihood in this procedure is computed by a Quality Criterion under the assumption that the Photon Noise is governed by Poisson statistics. (Photoelectrons collected by a detector exhibit a Poisson Distribution and have a square-root relationship between signal and noise.) For this reason it is optimally suited for low-signal images. In addition, it is well suited for restoring images of point-, line- or plane-like objects. See Maximum Likelihood Estimation for more details.
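The iterative optimization can be illustrated with its best-known textbook form, the Richardson-Lucy iteration, which multiplicatively updates the estimate to increase the Poisson likelihood. This is a generic numpy sketch with periodic boundaries, not the actual CMLE implementation of Huygens:

```python
import numpy as np

def richardson_lucy(k, h, n_iter=50):
    """Maximum-likelihood estimation of the object given the image k and
    the PSF h, assuming Poisson noise (Richardson-Lucy iteration)."""
    H = np.fft.fftn(np.fft.ifftshift(h))  # transform of the centered PSF

    def conv(a, T):
        # circular convolution via the Fourier domain (Eq. 3)
        return np.real(np.fft.ifftn(np.fft.fftn(a) * T))

    f_est = np.full_like(k, k.mean())     # flat, positive starting estimate
    for _ in range(n_iter):
        blurred = conv(f_est, H)                 # model image of the estimate
        ratio = k / np.maximum(blurred, 1e-12)   # measured / model
        f_est = f_est * conv(ratio, np.conj(H))  # redistribute via mirrored PSF
    return f_est
```

Each iteration blurs the current estimate, compares it with the measured image, and feeds the mismatch back through the mirrored PSF; with a PSF normalized to unit sum, the total image intensity is approximately preserved.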

There are however situations in which other algorithms come to the fore, for example when deconvolving 3D time series, which is very compute-intensive. In this case you may consider using Quick Maximum Likelihood Estimation (QMLE), which is much faster than CMLE and will give excellent results as well.

An advantage of using a measured PSF, as in Huygens, is that in essence it requires you to calibrate your microscope, and stimulates the use of standard imaging protocols. Together, these will ensure correct functioning of the microscope and vastly increase the quality and reliability of the microscopic data itself, and with that of the deconvolution results.

Lastly, an advantage of theoretical or measured PSFs is that they facilitate construction of very fast algorithms like the QMLE in Huygens Professional or the New Batch Processor Tutorial. Iterations in QMLE are about five times more effective than CMLE iterations and require less time per iteration.

Images affected by Spherical Aberration due to a Refractive Index Mismatch are better restored with the Huygens Software through the use of depth-dependent PSFs (see Parameter Variation).

Huygens algorithms generally do Intensity Preservation.

See the Huygens restoration applied to some accessible images in Convolving Trains.

### Validation

The CMLE method used in Huygens is backed by a substantial body of scientific literature. We mention here just three relevant examples (follow the previous link for a longer list):

- Verschure P.J., van der Kraan I., Manders E.M.M. and van Driel R. *Spatial relationship between transcription sites and chromosome territories*. J. Cell Biology (1999) **147**, 1, pp 13-24 (get pdf).
- Visser A.E. and Aten J.A. *Chromosomes as well as chromosomal subdomains constitute distinct units in interphase nuclei*. J. Cell Science (1999) **112**, pp 3353-3360 (get pdf).
- Hell S.W., Schrader M. and Van Der Voort H.T.M. *Far-field fluorescence microscopy with three-dimensional resolution in the 100-nm range*. J. of Microscopy (1997) **187**, Pt 1, pp 1-7 (get pdf).

### Examples

A metaphase human cell stained for DNA (red), centromeres (blue) and the anaphase promoting complex/cyclosome (green).

Upper part: original data.

Lower part: deconvolved with Huygens Professional. Recorded by Dr. Claire Acquaviva, Dr. Pines Lab.

Nucleus of a human epithelium cell stained with an antibody against splicing factor.

Top part: image as restored by Huygens Professional.

Bottom part: original image.

Both parts were visualized using the Sfp Renderer. Recorded by Dr. Marjolein A. Grande.

You can also find other images in Resolution Improvement and in Evans Macrophage.

There is a very simple and clarifying example of a deconvolution done with the Huygens Software in Decon Example. For more accessible examples see Convolving Trains. Real microscopy images can be seen on the SVI web page, at http://www.svi.nl/gallery/

### Further reading

See Doing Deconvolution for special topics and references.

Also see the list of Useful Links.

## Testing Huygens Deconvolution

Are you interested in testing the new version of the Huygens software with all its available options? Do not hesitate to download Huygens and request a test license.