Investigating collective animal movement with unmanned aerial systems
Collective behavior such as schooling, flocking, and herding is ubiquitous throughout the natural world. A growing body of theoretical and (primarily lab-based) empirical work predicts that traveling in groups improves navigation and the accuracy of collective decisions. For example, social travelers are hypothesized to pool the many independent directional estimates of group members, and have been shown to collectively sense and respond to complex, dynamic environmental gradients. However, the mechanisms by which collective decisions emerge within animal groups in their natural environment remain unclear. Recent advances in Unmanned Aerial Vehicles (UAVs) and computer vision now allow these questions to be investigated in natural systems. By combining autonomous robotics with computer vision, we have created a software and hardware framework that enables the recording and processing of animal trajectories in situ.
Using a machine learning algorithm, we locate animals, such as wildebeest, in video footage recorded from above. The accuracy of the method is comparable to that of human counts from aerial survey images of the Serengeti wildebeest population. We have further built a bespoke quadcopter designed to generate minimal noise, allowing it to be flown at low altitude over animal herds without causing disturbance. In combination, these tools allow us to extract simultaneous movement trajectories of large numbers of ungulates. We demonstrate our method by tracking zebra and wildebeest interactions in the Serengeti National Park, and the behavior of roe deer in an extensive grazing system in the UK.
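To illustrate the trajectory-extraction step, the sketch below shows one simple way that per-frame animal detections could be linked into movement trajectories. This is a hypothetical minimal example, not the authors' implementation: the learned detector itself is assumed, so each frame is represented only as a list of (x, y) centroids, and tracks are linked by greedy nearest-neighbour association within a gating distance.

```python
# Hypothetical sketch: linking per-frame detections into trajectories.
# The detector (a learned model locating animals in aerial video) is
# assumed; each frame is just a list of (x, y) centroids in pixels.
import math

def link_trajectories(frames, max_dist=50.0):
    """frames: list of per-frame detection lists, each a list of (x, y).
    Returns trajectories as lists of (frame_index, x, y) tuples."""
    trajectories = []  # all tracks, finished and active
    active = []        # indices of tracks that were extended last frame
    for t, detections in enumerate(frames):
        unmatched = list(detections)
        next_active = []
        for ti in active:
            if not unmatched:
                continue
            _, px, py = trajectories[ti][-1]
            # Greedily extend each active track with its nearest detection.
            d, best = min(
                (math.hypot(x - px, y - py), (x, y)) for x, y in unmatched
            )
            if d <= max_dist:  # gate: reject implausibly large jumps
                trajectories[ti].append((t, *best))
                unmatched.remove(best)
                next_active.append(ti)
        # Any detection left unmatched starts a new track.
        for x, y in unmatched:
            next_active.append(len(trajectories))
            trajectories.append([(t, x, y)])
        active = next_active
    return trajectories
```

A real pipeline would need motion prediction, occlusion handling, and globally optimal assignment rather than greedy matching, but the data structure (one trajectory per animal, indexed by frame) is the same.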