Performance Measurement Metrics for Images
Performance Measurement of Image
Processing Algorithms
By
Dr. Rajeev Srivastava
ITBHU, Varanasi
Performance Measurement:
Image Reconstruction
In various image applications where an image is to be reconstructed from its degraded version, the performance of the image processing algorithms needs to be evaluated quantitatively. For evaluation purposes, we must have the original image. Some examples of such image processing algorithms include:
• Image Restoration (Where original image is available for comparison purposes)
• Image Enhancement
• Image Compression
• Image Reconstruction (Tomographic Reconstruction etc.)
• Image Interpolation/ Zooming
• Image Inpainting
etc.
Performance Measures:
Image Reconstruction
Some performance measures for evaluating the previously mentioned image processing algorithms, where both the original image and the image reconstructed from its degraded version are available for evaluation purposes, are listed as follows:
- Mean Square Error (MSE)
- Root Mean Square Error (RMSE)
- Peak Signal-to-Noise Ratio (PSNR)
- Mean Absolute Error (MAE)
- Cross Correlation Parameter (CP)
- Mean Structure Similarity Index Map (MSSIM)
- Histogram Analysis
Performance Measures:
Image Reconstruction
For those cases where the original image is not available for comparison purposes, such as blind restoration of images and image enhancement, the following performance measures can be used:
• Blurred Signal-to-Noise Ratio (BSNR)
• Mean square error:

MSE = (1/mn) ∑_{i=1}^{m} ∑_{j=1}^{n} [I(i, j) − I′(i, j)]²

where I is the original image, I′ is the reconstructed image, and m × n is the image size.
• Root mean square error:

RMSE = √MSE
• Peak signal-to-noise ratio:

PSNR = 20 log₁₀ (255 / RMSE)

For optimal performance, the measured values of MSE and RMSE should be small and PSNR should be large.
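As a quick sketch, the three measures above can be computed with NumPy as follows (the function names are our own; the images are assumed to be equal-sized grey-scale arrays):

```python
import numpy as np

def mse(original, reconstructed):
    """Mean square error between two images of equal size."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.mean(diff ** 2)

def rmse(original, reconstructed):
    """Root mean square error."""
    return np.sqrt(mse(original, reconstructed))

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB (peak = 255 for 8-bit images)."""
    return 20.0 * np.log10(peak / rmse(original, reconstructed))
```

Note that PSNR is undefined when the two images are identical (RMSE = 0), so it is reported only for degraded reconstructions.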
• Correlation parameter (CP):

CP is a qualitative measure of edge preservation. If one is interested in suppressing noise or artefacts while at the same time preserving the edges of the original image, then this parameter, proposed by Salinas and Fernández (2007), can be used. To evaluate the performance of edge preservation or sharpness, the correlation parameter is defined as follows:

CP = ∑_{i=1}^{m} ∑_{j=1}^{n} (ΔI − mean(ΔI)) (ΔÎ − mean(ΔÎ)) / √[ ∑_{i=1}^{m} ∑_{j=1}^{n} (ΔI − mean(ΔI))² · ∑_{i=1}^{m} ∑_{j=1}^{n} (ΔÎ − mean(ΔÎ))² ]

where ΔI and ΔÎ are the high-pass filtered versions of the original image I and the filtered image Î, obtained via a 3×3 pixel standard approximation of the Laplacian operator, and mean(ΔI) and mean(ΔÎ) are the mean values of ΔI and ΔÎ, respectively. The correlation parameter should be closer to unity for an optimal effect of edge preservation.
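A minimal NumPy sketch of CP, using the standard 3×3 Laplacian as the high-pass filter (the valid-region handling at the borders is our own choice):

```python
import numpy as np

# standard 3x3 approximation of the Laplacian operator
LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=np.float64)

def laplacian_filter(img):
    """Apply the 3x3 Laplacian; only the valid interior region is kept."""
    img = img.astype(np.float64)
    m, n = img.shape
    out = np.zeros((m - 2, n - 2))
    for di in range(3):
        for dj in range(3):
            out += LAPLACIAN[di, dj] * img[di:di + m - 2, dj:dj + n - 2]
    return out

def correlation_parameter(original, filtered):
    """Edge-preservation CP: correlation between the mean-removed
    Laplacians of the original and the filtered image."""
    d1 = laplacian_filter(original)
    d2 = laplacian_filter(filtered)
    d1 = d1 - d1.mean()
    d2 = d2 - d2.mean()
    return np.sum(d1 * d2) / np.sqrt(np.sum(d1 ** 2) * np.sum(d2 ** 2))
```

By construction CP = 1 when the filtered image preserves the edge structure of the original exactly.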
Performance Measurement Metrics (contd.)
• Structure similarity index map (SSIM):

SSIM is used to compare the luminance, contrast and structure of two different images. It can be treated as a similarity measure of two different images. The SSIM of two images X and Y can be defined as

SSIM(X, Y) = [(2 μ_x μ_y + C₁) (2 σ_xy + C₂)] / [(μ_x² + μ_y² + C₁) (σ_x² + σ_y² + C₂)]

where μ_i (i = x or y) is the mean intensity, σ_i (i = x or y) is the standard deviation, σ_xy is the covariance of X and Y, and C_i (i = 1 or 2) is a constant included to avoid instability when μ_x² + μ_y² or σ_x² + σ_y² is very close to zero; it is defined as C_i = (k_i L)², in which k_i ≪ 1 and L is the dynamic range of pixel values, e.g. L = 255 for an 8-bit grey-scale image. In order to have an overall quality measurement of the entire image, the mean SSIM is defined as

MSSIM(X, Y) = (1/mn) ∑_{i=1}^{m} ∑_{j=1}^{n} SSIM(X_ij, Y_ij)

The MSSIM value should be closer to unity for an optimal measure of similarity.
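The SSIM formula above can be sketched for a single window as follows; the full MSSIM averages this quantity over local windows X_ij, Y_ij, and the constants k₁ = 0.01, k₂ = 0.03 used here are the values commonly chosen in practice, not fixed by the lecture:

```python
import numpy as np

def ssim_global(x, y, L=255.0, k1=0.01, k2=0.03):
    """Single-window SSIM computed over whole images.
    (MSSIM averages this over local windows, per the formula above.)"""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1 = (k1 * L) ** 2  # stabilises the luminance term
    c2 = (k2 * L) ** 2  # stabilises the contrast/structure term
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

Identical images give SSIM = 1; any degradation pushes the value below unity.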
• Normalized mean square error:

NMSE = ∑_{i=1}^{m} ∑_{j=1}^{n} [I(i, j) − I′(i, j)]² / ∑_{i=1}^{m} ∑_{j=1}^{n} I(i, j)²
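As a one-line NumPy sketch, NMSE normalizes the squared-error energy by the energy of the original image:

```python
import numpy as np

def nmse(original, reconstructed):
    """Normalized mean square error: error energy over signal energy."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    return np.sum((original - reconstructed) ** 2) / np.sum(original ** 2)
```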
Example: Histogram Analysis
Figure: Histogram analysis for 4x4 image interpolation: a) original image, b) nearest-neighbour interpolation, c) bilinear interpolation, d) bicubic interpolation, e) anisotropic diffusion method, f) proposed method; mri_head_2.jpg (256×256).
Performance Measures: Speckle Reduction Algorithms
For measuring the performance of speckle reduction algorithms, which are responsible for reducing speckle/multiplicative noise, the following performance measures can be used to evaluate the algorithms:
• Speckle Index (SI)
• Effective Number of Looks (ENL)
• Average Signal-to-Noise Ratio (Average SNR)
• Speckle index:

Since speckle noise is multiplicative in nature, the average contrast of an image may be treated as a measure of speckle removal. The speckle index (SI) is defined as

SI = √var(I) / E(I)

and its discrete version for an image reads

SI = (1/mn) ∑_{i=1}^{m} ∑_{j=1}^{n} σ(i, j) / μ(i, j)

where m × n is the size of the image, μ(i, j) is the local mean and σ(i, j) is the local standard deviation. For optimal performance, the measured value of SI should be low.
• The speckle index can be regarded as an average reciprocal signal-to-noise ratio (SNR), with the signal being the mean value and the noise being the standard deviation:

Average SNR = 1 / SI
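A minimal sketch of the discrete speckle index, using 3×3 local windows (the window size is our assumption; the lecture does not fix one):

```python
import numpy as np

def speckle_index(img, win=3):
    """Speckle index: mean of local std/mean ratios over win x win windows."""
    img = img.astype(np.float64)
    m, n = img.shape
    r = win // 2
    ratios = []
    for i in range(r, m - r):
        for j in range(r, n - r):
            patch = img[i - r:i + r + 1, j - r:j + r + 1]
            mu = patch.mean()
            if mu > 0:  # skip zero-mean patches to avoid division by zero
                ratios.append(patch.std() / mu)
    return float(np.mean(ratios))

def average_snr(img, win=3):
    """Average SNR as the reciprocal of the speckle index."""
    return 1.0 / speckle_index(img, win)
```

The double loop is written for clarity; a production version would vectorize the local statistics.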
• Effective number of looks (ENL):

The number of looks in an intensity image is a measure of the statistical fluctuations introduced by speckle resulting from the interference between randomly positioned scatterers. Thus, ENL essentially gives an idea of the smoothness of regions of the image that are supposed to have a homogeneous appearance but are corrupted by noise. ENL is generally defined as

ENL = μₜ² / σₜ²

where t denotes the target area or region of interest, and μₜ and σₜ are the pixel mean and standard deviation of the target area of the image. In this work, the target area is the whole image.
A large value of ENL reflects better quantitative performance of the filter.
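ENL is a one-liner once the target region is selected (here any array slice can serve as the target, e.g. the whole image as in the lecture):

```python
import numpy as np

def enl(target):
    """Effective number of looks: squared mean over variance of a region."""
    target = target.astype(np.float64)
    return target.mean() ** 2 / target.var()
```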
• Edge Detection Error Rate

Let n₀ be the number of edge pixels declared and n₁ the number of missed or new edge pixels after adding noise. If n₀ is held fixed for the noiseless as well as the noisy image, then the edge detection error rate (Pₑ) is defined as:

Pₑ = n₁ / n₀
• Another measure of the noise performance of edge detection operators is given by the quantity

P = (1 / max(N₁, N_D)) ∑_{i=1}^{N_D} 1 / (1 + α dᵢ²)

where dᵢ is the distance between a pixel declared as an edge and the nearest ideal edge pixel, α is a calibration constant, and N₁ and N_D are the numbers of ideal and detected edge pixels, respectively.
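Both edge measures can be sketched as follows; the figure-of-merit routine takes the edge pixels as coordinate lists, and α = 1/9 is a commonly used calibration value, chosen here as an assumption:

```python
import numpy as np

def edge_error_rate(n0, n1):
    """Edge detection error rate Pe = n1 / n0."""
    return n1 / n0

def figure_of_merit(ideal_edges, detected_edges, alpha=1.0 / 9.0):
    """P = (1/max(N1, ND)) * sum over detected pixels of 1/(1 + alpha*d^2),
    where d is the distance to the nearest ideal edge pixel."""
    ideal = np.asarray(ideal_edges, dtype=np.float64)
    detected = np.asarray(detected_edges, dtype=np.float64)
    total = 0.0
    for p in detected:
        # squared distance from this detected pixel to the nearest ideal pixel
        d2 = np.min(np.sum((ideal - p) ** 2, axis=1))
        total += 1.0 / (1.0 + alpha * d2)
    return total / max(len(ideal), len(detected))
```

P equals 1 only when every detected edge pixel lies exactly on an ideal edge pixel and the counts match.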
Performance Measures:
Image Segmentation Algorithms
The performance of the segmentation algorithm can be evaluated by obtaining three segmentation performance measures, namely:
• Probabilistic Rand Index (PRI) [Unnikrishnan R. et al. (2007)]
• Variation of Information (VOI) [Meila M. (2005)]
• Global Consistency Error (GCE) [Martin D. et al. (2001)]
with the sample images.
REFERENCES
• Meila M. (2005), "Comparing clusterings – an axiomatic view," in Proc. 22nd Int. Conf. on Machine Learning, pp. 577-584.
• Unnikrishnan R., Pantofaru C., and Hebert M. (2007), "Toward objective evaluation of image segmentation algorithms," IEEE Trans. Pattern Anal. Mach. Intell., Vol. 29, No. 6, pp. 929-944.
• Martin D., Fowlkes C., Tal D., and Malik J. (2001), "A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics," in Proc. 8th Int. Conf. on Computer Vision, Vol. 2, pp. 416-423.
Classification and Analysis
• Feature Evaluation
For evaluating the feature extraction and classification algorithms, the following measures are used:

Error = (100/40) × (con(1, 2) + con(2, 1))
Accuracy = 100 − Error

The above-mentioned measures are derived from the following confusion matrix.
• Example: Selecting useful features is important in cancer analysis/classification (malignant/benign classification). Some important features can be extracted and evaluated for analysis to reach this goal. We can supply a matrix of dimension 20 × 100, i.e. 20 images and 100 features, where each entry of a row represents one feature at a specific angle and distance. Then we can organize the matrix by the K-means clustering method to separate the two groups of malignant and benign tumors.
• For the classification of benign and malignant tumors for cancer detection, a 2×2 confusion matrix (con) can be formulated, where position (1, 1) shows the true classification of benign, position (2, 2) represents the true classification of malignant, position (1, 2) wrongly classifies malignant instead of benign, and position (2, 1) wrongly classifies benign instead of malignant tumors.
Classification and Analysis

Confusion Matrix (con):
- Con(1, 1): true classification of benign
- Con(1, 2): wrongly classifies malignant instead of benign
- Con(2, 1): wrongly classifies benign instead of malignant
- Con(2, 2): true classification of malignant
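The error and accuracy formulas above can be sketched directly from the confusion matrix; note that the code uses 0-based indexing, so con[0, 1] corresponds to the slide's Con(1, 2):

```python
import numpy as np

def error_and_accuracy(con, n_samples):
    """Percentage error and accuracy from a 2x2 confusion matrix
    (row = actual class, column = predicted class), following
    Error = (100/N) * (Con(1,2) + Con(2,1)) with N test samples."""
    con = np.asarray(con, dtype=np.float64)
    error = (100.0 / n_samples) * (con[0, 1] + con[1, 0])
    return error, 100.0 - error
```

For example, with 18 true benign, 17 true malignant, and 5 misclassifications out of 40 samples, the error is 12.5% and the accuracy 87.5%.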
END