A Pixel-Level Method for Multiple Imaging Sensor Data Fusion through Artificial Neural Networks



Multiple-image-sensor data fusion combines two or more images from different imaging sensors to achieve better performance than any individual sensor provides alone. This paper presents a new pixel-level method of fusing data from multiple image sensors for non-destructive inspection. In this method, the images from the different sensors are first processed and classified by artificial neural networks; the classified images are then fused to produce a resultant image that classifies more accurately than any of the individually classified images. The method was applied to identifying corrosion spots on aircraft panel specimens: ultrasonic and eddy current image data were run through artificial neural network classifiers to identify corroded spots on the same panel specimen, with an X-ray image serving as the benchmark. The results indicate that image data fusion consistently improved artificial neural network corrosion detection over either the eddy current or the ultrasonic image data alone, both overall and for low-corrosion pixels, which account for 90 percent of all corrosion pixels. Fusion improved the artificial neural network classification rates of the eddy current image by 12.6% and 12.21% on average for low-corrosion and overall corrosion classification, respectively, and those of the ultrasonic image by 28.88% and 32.18% on average for low-corrosion and overall corrosion classification, respectively. This pixel-level method for multiple-imaging-sensor data fusion is expected to be applicable to non-destructive inspection problems in various areas.
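The abstract does not specify the fusion rule applied to the classified images, so the following is only a minimal sketch of the pixel-level idea it describes: two per-pixel corrosion scores (stand-ins for the outputs of the eddy current and ultrasonic neural network classifiers) are combined by weighted averaging and thresholded into a binary corrosion map. The score values, weights, and threshold are illustrative assumptions, not values from the paper.

```python
# Hypothetical per-pixel corrosion scores in [0, 1], standing in for the
# outputs of two sensor-specific ANN classifiers (values are illustrative).
eddy_scores = [[0.9, 0.2], [0.1, 0.7]]
ultra_scores = [[0.8, 0.4], [0.3, 0.2]]

def fuse_pixel_level(a, b, w=0.5, threshold=0.5):
    """Fuse two per-pixel score maps of equal shape by weighted averaging,
    then threshold the fused score into a binary corrosion map (1 = corroded)."""
    return [
        [1 if w * pa + (1 - w) * pb >= threshold else 0
         for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(a, b)
    ]

fused_map = fuse_pixel_level(eddy_scores, ultra_scores)
```

Because fusion happens after classification, each sensor's classifier can be trained independently and the fusion step only has to reconcile their per-pixel decisions; other fusion rules (maximum, voting, a second-stage network) would drop into the same place.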
Key words: Multisensor Data Fusion; Imaging Sensor; Pixel Level; Artificial Neural Networks; Non-Destructive Inspection



DOI: http://dx.doi.org/10.3968/j.ans.1715787020110401.001






Articles published in Advances in Natural Science are licensed under Creative Commons Attribution 4.0 (CC-BY).



Copyright © 2010 Canadian Research & Development Centre of Sciences and Cultures