February 07, 2020

A large flock of waterfowl standing in water, photographed from above.

Professor Yi Shang and team are developing a deep learning system that will better identify small objects such as waterfowl in images.

Currently, when the Missouri Department of Conservation (MDC) wants to keep track of the waterfowl population in the state or monitor wetlands for invasive plant species, it sends a natural resource manager into the field to do a manual count, which is labor-intensive and not very accurate. Other methods of monitoring bird populations and wetland health include satellites and fixed-wing aircraft, but both are expensive and present logistical problems: the noise from planes can scatter flocks of birds, and satellite image resolution is not high enough to count individual birds. So the conservation department is turning to an engineering professor for a potential solution.

Yi Shang, a professor in the Department of Electrical Engineering and Computer Science, recently received a four-year, $372,000 grant from the MDC to assess the feasibility of using unmanned aircraft systems (drones) and deep learning techniques for waterfowl and wetland habitat monitoring. Shang, who has been with the University of Missouri for 20 years, says his graduate studies at the University of Illinois at Urbana-Champaign were in artificial intelligence and neural networks, and he spent two years at the renowned Xerox Palo Alto Research Center in Silicon Valley working on wireless sensor networks and mobile computing.

Portrait: Yi Shang

Prof. Yi Shang, Department of Electrical Engineering and Computer Science.

“This project is a combination of both,” he said. “The drone would be the mobile device used to collect the sensor data, and the camera would be one type of optical sensor. The other part is the intelligence—object recognition and automatic detection, so this is where we are going to use the neural networks and deep learning, to process the imagery from the cameras.”

Getting an Accurate Assessment

Shang said waterfowl imagery collected from manned aircraft can be difficult to interpret: at low resolution, birds on land can be confused with rocks or bushes. Tracking invasive plant species by satellite is expensive and depends on when the satellite passes overhead. He said Google Earth is not a good option either.

“The drone can take much more frequent, real-time imagery with a good resolution—if you can fly at the right height without disturbing the birds,” he said. “But this is not easy, so we are trying to find the right combination—how high you fly and for how long before the drone’s battery is used up in order to get good, high quality data. Without good data, you cannot do much with the machine learning.”
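The height-versus-detail trade-off Shang describes can be made concrete with the standard photogrammetry formula for ground sample distance (GSD), the ground area covered by a single pixel. The formula is standard, but the camera parameters below are illustrative placeholders, not the actual hardware used in the project:

```python
def ground_sample_distance(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Centimetres of ground covered by one pixel at a given altitude.

    Standard photogrammetry relation: GSD scales linearly with altitude,
    so flying higher covers more area per frame but each bird occupies
    fewer pixels, making detection harder.
    """
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Hypothetical camera: 13.2 mm sensor, 8.8 mm lens, 5472 px wide frame
# (roughly a common consumer drone camera).
low = ground_sample_distance(30, 13.2, 8.8, 5472)    # ~0.82 cm/px at 30 m
high = ground_sample_distance(120, 13.2, 8.8, 5472)  # ~3.29 cm/px at 120 m
```

Doubling the altitude doubles the GSD, which is why the team must find a height high enough not to flush the birds yet low enough that individual waterfowl remain resolvable.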

In the initial stages of the project, Shang said the drones will collect video and imagery that will be downloaded and run through his machine learning models, which are being trained offline using previously collected imagery. Eventually, Shang hopes to be able to process the images on the drone in real time.
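Drone frames are typically far larger than the input size of an object-detection model, so a common approach for small objects like birds (an assumption here, not a detail given in the article) is to split each frame into overlapping tiles before running the detector. A minimal sketch of the tiling step:

```python
def tile_image(width, height, tile=640, overlap=64):
    """Return (x, y) top-left corners of overlapping tiles covering the image.

    Overlap keeps a bird that straddles a tile boundary fully visible in
    at least one tile; duplicate detections from overlapping tiles are
    later merged (e.g. with non-maximum suppression). Tile size and
    overlap are illustrative values.
    """
    step = tile - overlap
    xs = list(range(0, max(width - tile, 0) + 1, step))
    ys = list(range(0, max(height - tile, 0) + 1, step))
    # Ensure the right and bottom edges of the frame are covered.
    if xs[-1] + tile < width:
        xs.append(width - tile)
    if ys[-1] + tile < height:
        ys.append(height - tile)
    return [(x, y) for y in ys for x in xs]

tiles = tile_image(5472, 3648)  # a hypothetical 20 MP aerial frame
```

Each tile is then passed through the trained model, and the per-tile detections are mapped back into full-frame coordinates for counting.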

“Without downloading all of the imagery, which requires high bandwidth and long delays, you could do bird counting as the drone is flying, and then get closer to see what kind of bird it is and if it is male or female,” he said. “You could fly high and find a flock of birds, and then lower the drone and do a detailed study, so the drone becomes more intelligent and efficient. This aspect is part of the exploratory research of our project.” He said this kind of adaptive decision-making from a mobile device is an important research direction for his team.
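The "fly high to find flocks, then descend for detail" behavior can be sketched as a simple budgeted decision loop. This is a hypothetical illustration of the idea, with made-up time costs and thresholds, not the project's actual flight controller:

```python
def plan_survey(cells, battery_s, high_cost_s=20, low_cost_s=60, flock_threshold=50):
    """Two-stage adaptive survey sketch.

    `cells` maps grid-cell ids to rough bird counts from a high-altitude
    pass (given directly here; in practice they would come from the
    onboard detector). The drone revisits only cells with large flocks
    at low altitude for species/sex identification, while a battery
    budget (in seconds) lasts. All costs and thresholds are assumptions.
    """
    plan, remaining = [], battery_s
    for cell, count in cells.items():
        if remaining < high_cost_s:
            break
        remaining -= high_cost_s           # quick high-altitude look
        plan.append((cell, "high"))
        if count >= flock_threshold and remaining >= low_cost_s:
            remaining -= low_cost_s        # descend for a detailed pass
            plan.append((cell, "low"))
    return plan

plan = plan_survey({"A": 120, "B": 3, "C": 80}, battery_s=150)
# Cell A gets a detailed pass; C's flock is skipped when the budget runs low.
```

The point of the sketch is the structure Shang describes: detection results gathered in flight feed back into where the drone spends its limited battery.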

Shang noted that machine learning has made great strides in facial recognition in recent years because faces offer many distinctive data points for models to learn from.

“How can we adapt facial recognition to bird recognition? This is not trivial because the features are very different and the models are different, so that’s also an important research direction of how to adapt and train our models that work on one kind of object to a different object and do it efficiently,” he said.
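The adaptation Shang describes is commonly done via transfer learning: keep a network pretrained on one domain as a fixed feature extractor and retrain only a small final layer for the new classes. The toy sketch below shows just that last step on made-up two-dimensional "embeddings"; the data, sizes, and learning rate are all illustrative, and in practice the features would come from a pretrained convolutional network:

```python
import math, random

def retrain_head(features, labels, dim, epochs=300, lr=0.5):
    """Fit a new binary classification head on fixed backbone features.

    Transfer-learning sketch: only this final logistic layer is trained;
    the (hypothetical) pretrained backbone that produced `features`
    stays frozen, which is what makes adapting to a new object type cheap.
    """
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in zip(features, labels):
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y                       # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

# Toy separable "embeddings": class 0 near (0, 0), class 1 near (1, 1).
random.seed(0)
feats = [(random.random() * 0.4, random.random() * 0.4) for _ in range(20)] \
      + [(0.6 + random.random() * 0.4, 0.6 + random.random() * 0.4) for _ in range(20)]
labs = [0] * 20 + [1] * 20
w, b = retrain_head(feats, labs, dim=2)
```

The hard part of the team's research question is what this sketch hides: when the source and target objects differ as much as faces and birds, the frozen features themselves may need adapting, not just the head.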

A Collaborative Endeavor

Shang’s co-PI on the project is Lisa Webb, a cooperative associate professor with the MU School of Natural Resources and a research ecologist with the USGS Missouri Cooperative Fish and Wildlife Research Unit. Webb has worked on waterfowl- and wetland-related research projects in Missouri for the past nine years, with a focus on understanding the environmental factors that influence waterfowl distribution and body condition at different spatial scales. Shang and Webb are working with MDC project leaders Andy Raedeke and Joel Sartwell. Shang has worked with the MDC for the past few years, developing mobile apps for surveys at shooting ranges and fisheries. About three years ago, he began working with the MDC on bird counting efforts, which was the impetus for his recent grant.


Prof. Shang’s team will use drones and machine learning to track invasive plant species in wetlands such as this.

He said the MDC is also concerned about the spread of invasive plant species, so that research aspect was included in his grant proposal.

“We will start with some invasive plants with higher color contrast than others in certain seasons, so it may be relatively easy to see them,” he said. “There are others that are not easy to distinguish, so we’ll start with the easy ones to see how well we can do using this technology.”

Shang also is beginning work on another project with a researcher in the School of Natural Resources who is interested in river bank stabilization and believes the technology Shang’s team is developing to monitor waterfowl and wetlands would work well in that role.

“The river bank changes over the years, so they want to survey the Mississippi River to check different bank stabilization methods,” he said. “They put big rocks along the banks to stabilize them, and sometimes it’s just trees and branches, but they want to check them all to see what’s most effective over time, and that involves taking pictures and then recognizing those automatically.”