As vegetation's role in landscape architecture, carbon sequestration, and biodiversity conservation gains increasing recognition, the need for accurate and efficient plant data acquisition has become urgent. Traditional ground-based vegetation monitoring methods are labor-intensive and suffer from limited spatiotemporal coverage, whereas unmanned aerial vehicle (UAV) technology has emerged as an innovative alternative, enabling cost-effective, high-resolution, and operationally flexible aerial surveys. Equipped with diverse sensors such as LiDAR, RGB cameras, multispectral/hyperspectral sensors, and thermal imaging devices, UAVs facilitate high-frequency, high-precision vegetation monitoring across diverse landscapes. This study reviews advancements in UAV-based plant information acquisition by synthesizing key technologies, methodologies, applications, and future research directions through bibliometric and content analyses of literature from the CNKI and Web of Science databases (2014–2024), with a focus on applications in ecology, agriculture, forestry, and landscape heritage conservation.
Temporal analysis of publication trends reveals a significant increase in UAV-related vegetation studies since 2016, driven by advancements in artificial intelligence, remote sensing, and civil drone technology. Research hotspots have gradually shifted from basic photogrammetric modeling and spectral analysis to advanced algorithmic integration, emphasizing intelligent, scalable, and real-time applications. Emerging frontiers include UAV swarm coordination, AI-driven autonomous flight control, and sustainable drone operations.
The research demonstrates that UAVs equipped with integrated sensors have significantly enhanced the accuracy and efficiency of vegetation data acquisition. LiDAR plays a vital role in retrieving structural parameters such as canopy height, tree diameter, and crown width through dense 3D point clouds, while multispectral and hyperspectral sensors extract physiological and biochemical indicators, including leaf area index, chlorophyll content, and nitrogen status. RGB imagery is extensively applied in vegetation texture recognition, photogrammetric modeling, and orthophoto generation, whereas thermal sensors estimate canopy temperature to monitor drought stress. The selection and deployment of these technical configurations are guided by the characteristics of the target vegetation, prevailing environmental conditions, and specific monitoring objectives. Accordingly, a standardized workflow is established, comprising demand analysis, sensor-platform integration, flight planning, data acquisition, geometric correction, image registration, and algorithmic inversion.
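To make the inversion step of this workflow concrete, the following minimal Python sketch illustrates two representative computations named above: a normalized difference vegetation index (NDVI) from co-registered red and near-infrared bands, and a canopy height model taken as the difference between LiDAR-derived surface and terrain models. The array inputs are hypothetical placeholders; in practice they would come from the corrected and registered UAV products described in the workflow.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), a common physiological indicator."""
    denom = nir + red
    # Guard against division by zero over non-reflective pixels.
    return np.where(denom > 0, (nir - red) / denom, 0.0)

def canopy_height_model(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Canopy height = digital surface model - digital terrain model, clipped at 0."""
    return np.clip(dsm - dtm, 0.0, None)

# Hypothetical rasters standing in for corrected, co-registered UAV products.
red = np.random.rand(512, 512)
nir = np.random.rand(512, 512)
dsm = 20.0 + 10.0 * np.random.rand(512, 512)   # surface elevations (m)
dtm = 20.0 * np.ones((512, 512))               # bare-earth elevations (m)

print(ndvi(nir, red).mean(), canopy_height_model(dsm, dtm).max())
```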
The analysis also highlights that machine learning and deep learning algorithms have profoundly transformed UAV data interpretation. Object detection models (e.g., YOLO), semantic segmentation frameworks (e.g., U-Net), and classification algorithms (e.g., Random Forest and SVM) have demonstrated exceptional performance in tasks such as individual tree identification, forest fire risk detection, and vegetation index estimation. These algorithms enable automated processing of UAV-collected imagery, making real-time precision agriculture and ecological monitoring feasible.
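As a simple illustration of the classification stage, the sketch below trains a Random Forest on per-pixel (or per-segment) feature vectors. The feature and label arrays are hypothetical stand-ins for stacked descriptors such as vegetation indices, canopy height, and canopy temperature with field-verified class labels; this is not the specific pipeline of any study reviewed here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical samples: each row is one pixel/segment with features such as
# [NDVI, canopy height, canopy temperature]; labels are vegetation classes.
rng = np.random.default_rng(0)
X = rng.random((1000, 3))
y = rng.integers(0, 3, size=1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("Hold-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```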
In terms of application domains, UAV-based plant information acquisition has evolved from early-stage forest inventory and land cover mapping to more complex scenarios, including carbon stock estimation, crop yield modeling, plant disease early warning, and habitat quality assessment. These applications span diverse vegetation types, such as forests, grasslands, wetlands, deserts, and urban green spaces. The study presents a comprehensive cross-matching analysis among vegetation types, disciplinary fields, sensor configurations, and data requirements, highlighting UAVs’ adaptability to address diverse ecological and agricultural challenges. UAVs are particularly effective in acquiring data in inaccessible or complex environments, such as steep mountainous terrain, intertidal mangrove zones, or fragmented urban ecosystems.
Despite these advancements, several critical challenges persist. Multi-sensor data fusion remains technically challenging, especially in scenarios requiring synchronization of LiDAR and hyperspectral data. Flight stability and obstacle avoidance in dense forests or urban areas remain limitations of current UAV systems. Additionally, the operational management of data collection, storage, and security has not kept pace with the rapid expansion of UAV-based monitoring. These challenges are further compounded by constraints in UAV endurance, payload capacity, and regulatory frameworks.
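One elementary step of such fusion is bringing the two data sources onto a common grid. The hedged sketch below resamples a coarser hyperspectral band onto a finer LiDAR canopy height model grid by nearest-neighbor lookup, assuming both rasters already share the same georeferenced extent; the temporal synchronization and geometric calibration that make fusion difficult in practice are not shown.

```python
import numpy as np

def resample_to_grid(coarse: np.ndarray, target_shape: tuple) -> np.ndarray:
    """Nearest-neighbor resampling of a coarse raster onto a finer target grid."""
    rows = np.arange(target_shape[0]) * coarse.shape[0] // target_shape[0]
    cols = np.arange(target_shape[1]) * coarse.shape[1] // target_shape[1]
    return coarse[np.ix_(rows, cols)]

# Hypothetical rasters covering the same extent at different resolutions.
chm = np.random.rand(1000, 1000)        # LiDAR canopy height model, fine grid
hyper_band = np.random.rand(100, 100)   # one hyperspectral band, coarse grid

# Stack height and spectral information into a single fused feature cube.
fused = np.dstack([chm, resample_to_grid(hyper_band, chm.shape)])
print(fused.shape)  # (1000, 1000, 2)
```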
Looking ahead, UAV-based vegetation monitoring is expected to advance in three key directions. First, expanding application scenarios beyond traditional ecosystems is essential, with a focus on understudied vegetation types such as shrubs, ground covers, and low-stature plants—particularly in urban and semi-natural environments—to support comprehensive green infrastructure management and ecological restoration. Second, prioritizing multi-sensor data fusion and hybrid algorithm development will address data heterogeneity and enhance detection accuracy in complex settings. Finally, establishing standardized data governance protocols—encompassing secure transmission, storage, and sharing—is critical to balancing technological innovation with privacy and safety, fostering a sustainable and intelligent UAV monitoring ecosystem.
In conclusion, UAV-based plant information acquisition is at the forefront of environmental monitoring and precision management. Through continuous technological integration, algorithmic refinement, and application expansion, UAV systems hold the potential to revolutionize plant resource assessment, enabling high-resolution, multidimensional, and intelligent observation for sustainable landscape and ecological governance.