Article Information
Abstract
English
Skeletonization is a crucial step in many digital image processing applications such as medical imaging, pattern recognition, and fingerprint classification. The skeleton is one pixel wide and expresses the structural connectivity of the main component of an object. This paper covers the pixel-deletion criteria that skeletonization algorithms require in order to preserve the connectivity, topology, and sensitivity of binary images. The performance of different skeletonization algorithms can be measured in terms of parameters such as thinning rate, number of connected components, and execution time. This paper focuses on Peak Signal-to-Noise Ratio, number of connected components, execution time, and Mean Squared Error for the Zhang and Suen algorithm and the Guo and Hall algorithm.
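For orientation, the sketch below is a minimal, generic Python rendering of the classic Zhang-Suen two-sub-iteration thinning scheme together with MSE/PSNR measurement between the input and the skeleton; it is not the authors' implementation, and the function names, the foreground-equals-1 convention, and the 0/255 scaling used for PSNR are illustrative assumptions.

```python
import numpy as np

def neighbours(y, x, img):
    """Return the 8 neighbours of img[y, x] in the order p2..p9 (clockwise from north)."""
    return [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
            img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]

def transitions(nb):
    """Count 0->1 transitions in the circular sequence p2, p3, ..., p9, p2."""
    seq = nb + nb[:1]
    return sum(a == 0 and b == 1 for a, b in zip(seq, seq[1:]))

def zhang_suen(image):
    """Thin a binary image (foreground = 1) down to a one-pixel-wide skeleton."""
    img = image.copy().astype(np.uint8)
    changed = True
    while changed:
        changed = False
        for step in (0, 1):                       # two sub-iterations per pass
            to_delete = []
            for y in range(1, img.shape[0] - 1):
                for x in range(1, img.shape[1] - 1):
                    if img[y, x] != 1:
                        continue
                    nb = neighbours(y, x, img)
                    p2, p3, p4, p5, p6, p7, p8, p9 = nb
                    if not (2 <= sum(nb) <= 6):   # (a) keep endpoints and interior pixels
                        continue
                    if transitions(nb) != 1:      # (b) keep pixels whose removal breaks connectivity
                        continue
                    if step == 0:                 # (c), (d) for the first sub-iteration
                        if p2 * p4 * p6 != 0 or p4 * p6 * p8 != 0:
                            continue
                    else:                         # (c'), (d') for the second sub-iteration
                        if p2 * p4 * p8 != 0 or p2 * p6 * p8 != 0:
                            continue
                    to_delete.append((y, x))
            for y, x in to_delete:                # delete flagged pixels after the scan
                img[y, x] = 0
                changed = True
    return img

def mse_psnr(original, skeleton, peak=255.0):
    """Mean Squared Error and Peak Signal-to-Noise Ratio between two binary images."""
    a = original.astype(np.float64) * peak        # assumption: scale 0/1 images to 0/255
    b = skeleton.astype(np.float64) * peak
    mse = np.mean((a - b) ** 2)
    psnr = float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    return mse, psnr
```

The Guo and Hall algorithm follows the same two-sub-iteration structure but uses different deletion conditions, so a comparison of the two would swap only the tests inside the inner loop while reusing the same MSE/PSNR measurement.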
Table of Contents
1. Introduction
1.1. Need of Skeletonization
1.2. Applications of Skeletonization
2. Survey of Related Work
3. Overview of Skeletonization Algorithms
4. Zhang and Suen and Guo and Hall Algorithm
5. Performance Measures
6. Skeletonization Algorithms
7. Results of ZS in Comparison to GH Algorithm
8. Conclusion and Future Scope
References
