# Quality stop criterion

The quality stop criterion used in the Maximum Likelihood Estimation (MLE) algorithms is based on Csiszár's I-divergence (information divergence, also called relative entropy or Kullback-Leibler distance) measure [1, 2, 3]. Briefly put, this is an optimal measure for characterizing the difference between two non-negative images with a single number.
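For two non-negative images $a$ and $b$, the I-divergence generalizes the Kullback-Leibler distance to unnormalized data. A minimal NumPy sketch (the `eps` guard against `log(0)` is our own addition, not part of the Huygens implementation):

```python
import numpy as np

def i_divergence(a, b, eps=1e-12):
    """Csiszar I-divergence between two non-negative arrays:
    I(a, b) = sum( a*log(a/b) - a + b ).
    Reduces to the Kullback-Leibler distance when a and b are
    normalized probability distributions."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.sum(a * np.log((a + eps) / (b + eps)) - a + b))

# Identical images have zero divergence:
print(i_divergence([1.0, 2.0], [1.0, 2.0]))  # → 0.0
```

The divergence is zero only when the two images are identical, and grows as they differ, which is what makes it usable as a single-number progress measure.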

The quality factor reported by the Huygens Software is the ratio of the I-divergence between the first (imaged) estimate and the measured image to the I-divergence between the current imaged estimate and the measured image.
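In code, this ratio might look like the following sketch (function and variable names are our own, not Huygens internals):

```python
import numpy as np

def i_divergence(a, b, eps=1e-12):
    """Csiszar I-divergence: sum( a*log(a/b) - a + b )."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sum(a * np.log((a + eps) / (b + eps)) - a + b))

def quality_factor(first_estimate, current_estimate, measured):
    """Ratio of the I-divergence of the first estimate to that of the
    current estimate, both taken against the measured image.
    Grows above 1 as the estimates improve."""
    return (i_divergence(first_estimate, measured)
            / i_divergence(current_estimate, measured))

measured = np.array([1.0, 4.0, 2.0])
first = np.array([2.0, 2.0, 2.0])    # poor initial estimate
current = np.array([1.1, 3.8, 2.0])  # closer to the measurement
print(quality_factor(first, current, measured) > 1.0)  # → True
```

Because the first estimate's divergence is fixed, the factor increases monotonically as long as the current estimate keeps getting closer to the measurement.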

In the Iterative Constrained Tikhonov-Miller (ICTM) algorithm the quality factor is derived from the Tikhonov-Miller functional, as described in the literature.
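In its general form, the Tikhonov-Miller functional balances a data-fit term against a regularization term. Sketched here with $H$ the blurring (PSF) operator, $g$ the measured image, and $\lambda$ the regularization parameter; the exact form used by Huygens is given in the cited literature:

$$
\Phi(f) \;=\; \lVert H f - g \rVert^{2} \;+\; \lambda \lVert f \rVert^{2}
$$

Minimizing $\Phi(f)$ under a non-negativity constraint on $f$ is what the ICTM iterations do.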

### Quality factor interpretation

The absolute value of the final Quality factor depends strongly on the data, the microscope type, and the background. It is a global value computed over the entire image, so the contribution of a local resolution increase can be small.
For example, suppose you have a large featureless image with one tiny object. While the tiny object may be restored very well, the change in the featureless part is negligible. The quality factor will therefore hardly change, though the restoration is successful.
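This dilution effect can be made concrete with hypothetical numbers: a large background that every estimate gets slightly wrong, plus one tiny object that the restoration improves dramatically. All values below are invented for illustration only:

```python
import numpy as np

def i_divergence(a, b, eps=1e-12):
    """Csiszar I-divergence: sum( a*log(a/b) - a + b )."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sum(a * np.log((a + eps) / (b + eps)) - a + b))

# 9999 background pixels plus one tiny object:
measured = np.full(10000, 10.0)
measured[0] = 100.0

first = np.full(10000, 10.5)    # background slightly off everywhere
first[0] = 50.0                 # object badly estimated

current = np.full(10000, 10.5)  # background mismatch unchanged
current[0] = 99.0               # object restored almost perfectly

q_global = i_divergence(first, measured) / i_divergence(current, measured)
q_object = (i_divergence(first[:1], measured[:1])
            / i_divergence(current[:1], measured[:1]))
# Object alone improves ~3000-fold, but the global factor barely moves:
print(q_global < 1.2 < q_object)  # → True
```

The background's fixed divergence dominates the sum, so the successful local restoration barely registers in the global quality factor.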

The quality factor can therefore only be used in a relative way, to compare successive iterations of the same restoration; this is how the stop criterion based on the Quality Change Threshold parameter uses it. You cannot compare these values between different images.
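As a sketch, the stop criterion amounts to halting once the quality factor stops improving by more than the threshold. The function name and the toy quality sequence below are hypothetical, and whether the change is measured absolutely or relatively is an implementation detail (this sketch uses the absolute change):

```python
def run_until_converged(qualities, threshold=0.01):
    """Return (iteration, quality) at the point where the quality
    factor improves by less than `threshold` between successive
    iterations -- a generic sketch of the stop criterion."""
    prev = qualities[0]
    for i, q in enumerate(qualities[1:], start=1):
        if q - prev < threshold:
            return i, q  # improvement stalled: stop here
        prev = q
    return len(qualities) - 1, prev  # never stalled within the run

# Toy sequence of quality factors that levels off:
stop_iter, final_q = run_until_converged([1.0, 1.4, 1.6, 1.605, 1.606])
print(stop_iter)  # → 3
```

A larger threshold stops the iterations earlier; a smaller one lets them run until the quality gain per iteration is almost nil.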

As a rule, widefield images, which contain much signal, show a much higher quality increase than, for example, confocal images, which collect far less signal because of the pinhole.

While running a deconvolution in the Huygens deconvolution wizard this value is usually already preset to a decimal (e.g. 0.1 or 0.01). In the web-based HRM you must enter it manually.

References:
[1] Csiszár I. (1991) Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems. Ann. Stat., 19, No. 4, pp. 2033-2066.
[2] http://mathworld.wolfram.com/RelativeEntropy.html
[3] http://en.wikipedia.org/wiki/Kullback-Leibler_divergence

See also the FAQ "MLE vs ICTM - Which method is more effective under certain circumstances?".

Keywords: Csiszar, Csiszár.