Abstract:
As a key parameter for an in-depth understanding of the growth and health status of fruit trees, chlorophyll content plays an important role in guiding orchard management decisions. However, in large-scale orchards, quickly and accurately obtaining chlorophyll content data for the entire orchard remains a major challenge. Therefore, a new solution was proposed using a UAV remote sensing platform combined with deep learning algorithms. RGB and multispectral images of the pomegranate tree canopy were collected by a multispectral UAV, and image processing techniques were used to extract parameters such as RGB color features, texture features, and multispectral vegetation indices to establish different datasets. On this basis, combined with ground-measured chlorophyll data, a deep fusion network model, CNN-BiGRU, combining a bidirectional gated recurrent unit (BiGRU) and a convolutional neural network (CNN), was constructed and experimentally compared with the original CNN and random forest (RF). Experimental results showed that the combined model significantly outperformed the other models in predicting the chlorophyll content of pomegranate trees, especially when modeling on the feature fusion set, achieving a coefficient of determination as high as 0.9737 and a root mean square error as low as 0.8233. This accuracy meets the requirements for accurate prediction of the chlorophyll content of pomegranate trees, providing a practical reference for large-scale orchard management.
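To illustrate the kind of architecture the abstract describes, the following is a minimal sketch (not the authors' implementation) of a CNN-BiGRU regressor in PyTorch. It assumes the fused features (RGB color and texture statistics plus multispectral vegetation indices) are supplied as a single feature vector per tree; the layer sizes, feature count, and class name are illustrative assumptions.

```python
# Minimal CNN-BiGRU regression sketch (illustrative, not the paper's code).
import torch
import torch.nn as nn

class CNNBiGRU(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        # 1-D convolutions extract local patterns along the ordered feature vector
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Bidirectional GRU aggregates the convolutional feature maps in both directions
        self.bigru = nn.GRU(input_size=32, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        # Regression head maps the final forward/backward states to one chlorophyll value
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features) -> (batch, 1, n_features) for Conv1d
        z = self.cnn(x.unsqueeze(1))          # (batch, 32, n_features)
        z = z.permute(0, 2, 1)                # (batch, n_features, 32) for the GRU
        _, h = self.bigru(z)                  # h: (2, batch, hidden)
        h = torch.cat([h[0], h[1]], dim=1)    # concatenate forward/backward states
        return self.head(h).squeeze(-1)       # (batch,) predicted chlorophyll content

# Example usage: 20 fused features per sample (hypothetical count), batch of 8
model = CNNBiGRU(n_features=20)
pred = model(torch.randn(8, 20))
print(pred.shape)  # torch.Size([8])
```

In this sketch the CNN acts as a local feature extractor and the BiGRU models dependencies across the fused feature set in both directions, matching the "CNN + BiGRU fusion" idea stated in the abstract; the actual network depth, kernel sizes, and training setup would need to follow the paper's methods section.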