Objective
Operational flexibility to support the Department of Defense’s (DoD) training missions is well-served by conserving biodiversity and ecosystem integrity on DoD-managed lands and neighboring properties. DoD natural resource managers need reliable and cost-effective methods to monitor and conserve wildlife populations, especially in difficult-to-reach terrain such as vertical cliffs. The objectives of this project are to: 1) develop a time- and cost-effective approach for wildlife monitoring by incorporating near real-time artificial intelligence (AI) detection of wildlife, specifically cliff-nesting Golden Eagles (Aquila chrysaetos) and other raptors, based on images collected via Uncrewed Aerial Systems (UAS) flights and “you only look once” (YOLO) computer vision models; 2) validate the technology and protocols of this project by surveying areas previously unsampled during model training and testing; and 3) create technology and protocols that are transferable across DoD installations and generalizable to other taxa.
Technology Description
YOLO is a rapidly evolving computer vision approach for automated and semi-automated wildlife detection from UAS-based imagery (Lawrence et al., 2023; Ma & Yang, 2022; Roy et al., 2023). As a single-shot detector, YOLO has a relatively fast inference time, making it appealing for real-time object detection compared with two-stage deep learning detectors, whose region-proposal convolutional neural network pipelines have more demanding memory and processing requirements (Lawrence et al., 2023). To accommodate real-time detection in harsh field environments where portability is essential, this project will deploy the YOLO model on a rugged Azure Stack Edge Pro (or equivalent) device with a built-in graphics processing unit to enable image preprocessing and accelerated AI inferencing. The device is a hardware-as-a-service solution that leverages cloud capabilities for data storage and processing, yet can operate in disconnected mode for offline scenarios in challenging, remote areas (Azure Stack Edge documentation, 2023). The goal of this project is to improve and build upon the YOLO model from prior work, extending it into a near real-time detection system that geolocates natural resources from UAS imagery, and to apply it at DoD sites in Idaho, Arizona, and Utah.
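To illustrate the geolocation step (mapping a detection's bounding-box center in a UAS image to an approximate ground position), the following minimal sketch assumes a nadir-pointing camera on a level platform; the function and parameter names are illustrative, not part of the project's protocol.

```python
import math

def pixel_to_ground_offset(px, py, img_w, img_h, fov_h_deg, fov_v_deg, alt_m):
    """Approximate ground offset (east_m, north_m) of a detection's
    bounding-box center, assuming a nadir-pointing camera on a level UAS.
    A fielded system would also apply camera orientation and GPS data."""
    # Ground footprint of the frame at altitude alt_m (pinhole-camera geometry)
    ground_w = 2 * alt_m * math.tan(math.radians(fov_h_deg) / 2)
    ground_h = 2 * alt_m * math.tan(math.radians(fov_v_deg) / 2)
    # Normalized offsets of the pixel from the image center, in [-0.5, 0.5]
    dx = px / img_w - 0.5
    dy = py / img_h - 0.5
    # Image y grows downward, so negate dy for a north-positive offset
    return dx * ground_w, -dy * ground_h

# Example: a detection at the right edge of a 4000x3000 frame taken at 60 m AGL
east, north = pixel_to_ground_offset(4000, 1500, 4000, 3000, 70.0, 55.0, 60.0)
```

Applied to a YOLO detection, `px` and `py` would be the bounding-box center returned by the model; adding the resulting offsets to the UAS GPS fix yields an approximate mappable coordinate for the nest or animal.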
Benefits
Traditional wildlife-monitoring methods can require extensive time and labor because of the challenges of working in remote areas. UAS-based monitoring improves access to difficult-to-reach terrain, and high-quality image and thermal sensors have a high likelihood of detecting wildlife (or wildlife sign). However, post-flight image and data analysis is time-consuming and expensive. By creating a protocol for near real-time AI detection of wildlife, this project will greatly decrease the cost of UAS-based surveys. DoD natural resource managers will be able to efficiently survey, monitor, and map sensitive species in cliff (and other) environments that are traditionally difficult to access. From the beginning, the project team will aim to develop the novel technology so that it can be broadly applied to other natural systems. The project team expects a good return on investment: once the technology for near real-time AI detection of bird nests is developed, it can be reused in many systems and for other applications. These benefits will enable the DoD to effectively manage and conserve biodiversity, including species protected by the Bald and Golden Eagle Protection Act and the Migratory Bird Treaty Act.
References:
Azure Stack Edge documentation. (2023). https://learn.microsoft.com/en-us/azure/databox-online/azure-stack-edge-pro-r-overview
Lawrence, B., de Lemmus, E., & Cho, H. (2023). UAS-based real-time detection of Red-Cockaded Woodpecker cavities in heterogeneous landscapes using YOLO object detection algorithms. Remote Sensing, 15(4), 883. https://doi.org/10.3390/rs15040883
Ma, D., & Yang, J. (2022). YOLO-Animal: An efficient wildlife detection network based on improved YOLOv5. 2022 International Conference on Image Processing, Computer Vision and Machine Learning (ICICML), 464–468. https://doi.org/10.1109/ICICML57342.2022.10009855
Roy, A. M., Bhaduri, J., Kumar, T., & Raj, K. (2023). WilDect-YOLO: An efficient and robust computer vision-based accurate object localization model for automated endangered wildlife detection. Ecological Informatics, 75, 101919. https://doi.org/10.1016/j.ecoinf.2022.101919