A Novel Multispectral Fusion Defect Detection Framework With Coarse-to-Fine Multispectral Registration

Jiacheng Li, Bin Gao*, Wai Lok Woo, Jieyi Xu, Lei Liu, Yu Zeng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


This article introduces a new imaging approach to nondestructive defect detection by combining visual testing (VT) and infrared thermal testing (IRT) in a multispectral vision sensing fusion system. The goal is to overcome challenges that hamper traditional imaging methods, including complex environments, irregular samples, varied defect types, and the need for efficient detection. The proposed system simultaneously detects and classifies surface and subsurface defects, addressing issues such as false detection caused by changes in surface emissivity in IRT and the inability of VT to detect subsurface defects. A novel multispectral fusion defect detection framework is proposed, employing coarse-to-fine multispectral registration for accurate alignment of infrared and visible images with different resolutions and fields of view. Domain adaptation unifies the feature domains of infrared and visible images by replacing the phase components in the frequency domain. The framework exploits the complementary information of the infrared and visible modalities to enhance detection accuracy and robustness. Experimental validation on different specimens confirms the effectiveness of the proposed framework in detecting defects and generalizing across various shapes and materials. Overall, this article presents a novel imaging system that combines VT and IRT, offering improved detection capabilities in complex environments and diverse defect scenarios. The demo code is available at https://github.com/ljcuestc/YoloMultispectralFusion-Coarse-to-fine-Registration.gi
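The frequency-domain phase replacement mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; it simply assumes two registered 2-D grayscale arrays and shows the basic operation of keeping one image's amplitude spectrum while substituting the other's phase spectrum:

```python
import numpy as np

def swap_phase(src, ref):
    """Illustrative sketch: keep the amplitude spectrum of `src`
    but replace its phase with the phase of `ref`.

    src, ref: 2-D float arrays of the same shape (e.g. registered
    infrared and visible images). Function name and interface are
    assumptions for illustration only.
    """
    f_src = np.fft.fft2(src)
    f_ref = np.fft.fft2(ref)
    amplitude = np.abs(f_src)      # magnitude spectrum of src
    phase = np.angle(f_ref)        # phase spectrum of ref
    fused = amplitude * np.exp(1j * phase)
    # The inverse transform is complex in general; take the real part.
    return np.real(np.fft.ifft2(fused))
```

As a sanity check, replacing an image's phase with its own phase reconstructs the image itself, i.e. `swap_phase(img, img)` returns `img` up to floating-point error.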
Original language: English
Pages (from-to): 1-13
Number of pages: 13
Journal: IEEE Transactions on Instrumentation and Measurement
Early online date: 19 Dec 2023
Publication status: Published - 2 Jan 2024