Academic Journal of Computing & Information Science, 2021, 4(6); doi: 10.25236/AJCIS.2021.040611.

Hyperspectral remote sensing image classification based on random forest

Author(s)

Luo Zhikun1, Tan Kaiyao2

Corresponding Author:

Luo Zhikun

Affiliation(s)

1Hunan University of Science and Technology, School of Resource & Environment and Safety Engineering, Xiangtan 411201, China

2Guizhou University, School of Computer Science and Technology, Guizhou 550025, China

Abstract

With the development of remote sensing technology and machine learning, research on hyperspectral remote sensing image classification has progressed rapidly. In this paper, a new classification model for hyperspectral remote sensing images is proposed on the basis of the random forest model; it exploits both the spectral information and the texture information of hyperspectral images. First, texture features are extracted from the hyperspectral remote sensing image data and superimposed on the original spectral-domain data, forming new spectral-space domain data. Then, an optimal parameter combination is selected by tuning the number of decision trees, the maximum tree depth and the minimum number of leaves, which gives the model higher classification accuracy on the datasets than the original random forest model. Experimental results on the Indian Pines, KSC and Salinas hyperspectral images show that the proposed method alleviates the nonlinearity and information redundancy of hyperspectral data through the expanded feature set and the selected parameter combination, thereby improving the classification performance and average accuracy of the original random forest.
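The two-step pipeline described in the abstract can be sketched in a few lines of Python. The following is a minimal, hedged illustration rather than the authors' code: it assumes GLCM statistics (contrast, homogeneity, energy) as the texture features, substitutes a small synthetic cube for the real datasets, and tunes the three random-forest parameters named in the abstract with a plain grid search.

```python
# Minimal sketch of the pipeline under stated assumptions: GLCM texture
# features, a synthetic stand-in cube, and a grid search over the three
# random-forest parameters named in the abstract. Not the authors' code.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

def glcm_texture(band, window=7, levels=32):
    """Per-pixel GLCM contrast/homogeneity/energy in a sliding window."""
    # Quantize the band to a few gray levels so the co-occurrence matrix stays small.
    edges = np.linspace(band.min(), band.max(), levels)
    q = np.clip(np.digitize(band, edges) - 1, 0, levels - 1).astype(np.uint8)
    pad = window // 2
    qp = np.pad(q, pad, mode="reflect")
    h, w = q.shape
    feats = np.zeros((h, w, 3), dtype=np.float32)
    for i in range(h):
        for j in range(w):
            patch = qp[i:i + window, j:j + window]
            glcm = graycomatrix(patch, distances=[1], angles=[0],
                                levels=levels, symmetric=True, normed=True)
            feats[i, j] = [graycoprops(glcm, p)[0, 0]
                           for p in ("contrast", "homogeneity", "energy")]
    return feats

# Synthetic stand-in for a hyperspectral cube and its ground truth
# (the paper uses Indian Pines, KSC and Salinas; 0 marks unlabeled pixels).
rng = np.random.default_rng(0)
cube = rng.random((32, 32, 20)).astype(np.float32)
labels = rng.integers(0, 4, size=(32, 32))

# Step 1: extract texture features and stack them onto the spectral bands
# to form the spectral-space domain data.
texture = glcm_texture(cube.mean(axis=-1))          # texture of the mean image
data = np.concatenate([cube, texture], axis=-1)

X, y = data[labels > 0], labels[labels > 0]

# Step 2: search for the optimal parameter combination.
grid = {"n_estimators": [50, 100, 200],      # number of decision trees
        "max_depth": [None, 10, 20],         # maximum depth
        "min_samples_leaf": [1, 2, 5]}       # minimum number of leaves
search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=3)
search.fit(X, y)
print("best parameters:", search.best_params_)
```

Computing the GLCM per pixel, as here, is the simplest formulation but is slow on full scenes; in practice the texture band is typically computed once per image with a vectorized filter before stacking.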

Keywords

hyperspectral remote sensing, texture features, spectral-space domain, random forest model, optimal combination of parameters

Cite This Paper

Luo Zhikun, Tan Kaiyao. Hyperspectral remote sensing image classification based on random forest. Academic Journal of Computing & Information Science (2021), Vol. 4, Issue 6: 67-71. https://doi.org/10.25236/AJCIS.2021.040611.
