Quantifying the similarity of 2D images using edge pixels: an application to the forensic comparison of footwear impressions
We propose a novel method to quantify the similarity between an impression (Q) from an unknown source and a test impression (K) from a known source. Exploiting the geometrical congruence between the impressions, we quantify the degree of correspondence with ideas from graph theory, in particular the maximum clique (MC) problem. The algorithm takes as data the x and y coordinates of edge pixels in the images. We focus on local areas in Q and the corresponding regions in K and extract features for comparison. Using pairs of images of known origin, we train a random forest to classify pairs as mates or non-mates. We collected impressions from 60 pairs of shoes of the same brand and model, worn over six months. On a different set of very similar shoes, we evaluated the accuracy with which the algorithm classifies images into source classes. Using classification error rates and ROC curves, we compare the proposed method to other algorithms in the literature and show that, for these data, it achieves good classification performance. The algorithm can be implemented with the R package shoeprintr.
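To give intuition for the maximum-clique idea mentioned in the abstract, the sketch below (a generic illustration in Python, not the paper's implementation and not the shoeprintr package) builds a correspondence graph between two small point sets: each node is a candidate pairing of a point in Q with a point in K, and two pairings are connected when they preserve the pairwise distance, i.e. are geometrically congruent. The largest clique is then the largest mutually consistent set of correspondences. The brute-force search is only suitable for tiny inputs.

```python
from itertools import combinations

def congruence_clique(P, Q, tol=0.1):
    """Generic maximum-clique correspondence sketch.

    Nodes of the correspondence graph are candidate pairings (i, j)
    of point P[i] with point Q[j]; two pairings are compatible
    (joined by an edge) when the inter-point distances they imply
    agree to within `tol`.  The maximum clique is the largest
    geometrically consistent set of correspondences.
    Brute force: for illustration on tiny point sets only.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    nodes = [(i, j) for i in range(len(P)) for j in range(len(Q))]

    def compatible(m, n):
        (i1, j1), (i2, j2) = m, n
        if i1 == i2 or j1 == j2:        # enforce one-to-one matching
            return False
        return abs(dist(P[i1], P[i2]) - dist(Q[j1], Q[j2])) <= tol

    # enumerate candidate subsets from largest to smallest and return
    # the first subset whose pairings are all mutually compatible
    for k in range(len(nodes), 0, -1):
        for subset in combinations(nodes, k):
            if all(compatible(a, b) for a, b in combinations(subset, 2)):
                return list(subset)
    return []

# Usage: a triangle in Q that is a translated copy of one in P is
# recovered as a clique of size 3 (a full correspondence).
P = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
Q = [(5.0, 5.0), (6.0, 5.0), (5.0, 6.0)]
match = congruence_clique(P, Q)
```

The function names and the distance tolerance `tol` are illustrative assumptions; the paper extracts features from such cliques on local regions of the impressions rather than returning the clique itself.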
This is a manuscript of an article published as Park, Soyoung, and Alicia Carriquiry. "Quantifying the similarity of 2D images using edge pixels: an application to the forensic comparison of footwear impressions." Journal of Applied Statistics (2020): 1-28. Posted with permission of CSAFE.