- Existing methods for monitoring forest restoration success and habitat quality are costly, time-consuming, or both.
- A new methodology that converts images gathered by drones to 3D “point maps” can achieve accuracy comparable to current techniques, including LiDAR.
- The technique received the September Elsevier Atlas award for “research that could significantly impact people’s lives around the world.”
Rakan Zahawi had a problem.
Zahawi, a scientist with the Organization for Tropical Studies in Costa Rica, had teamed up with a group from the University of Maryland and the University of California-Santa Cruz to test a new technique to monitor the recovery of tropical forests. They were planning to fly unmanned aerial vehicles, a.k.a. drones, over a handful of restoration plots. The drones would snap thousands of photos of each plot, which the researchers would analyze with sophisticated software and algorithms to quantify each canopy’s structure, height, aboveground biomass and suitability for birds.
But all did not go as planned. “These drones would just fly off and crash,” Zahawi said.
“The propellers just started exploding… ripping apart,” said Jonathan Dandois, a geospatial scientist on the team.
It was a costly and time-consuming mystery—the team speculated that the drones couldn’t fly in clouds, that there was too much humidity, that the drones were laid low by rain. Eventually, they discovered that a chemical sealant used to help hold the drones’ six plastic propellers in place had leaked from the rotor’s threads and onto the propellers, where it degraded and weakened the plastic.
They destroyed three drones in the process and had to return the following summer to complete the study, but the team’s perseverance paid off. Their research offers a new, low-cost way to evaluate forest regeneration. And their work received the Atlas award, which recognizes “research that could significantly impact people’s lives around the world or has already done so” from among the articles published each month in Elsevier’s more than 2,500 journals.
A middle ground
Researchers, government officials and land managers interested in evaluating restoration projects and calculating the biomass and accumulated carbon in an area usually rely on either manual measurements or an aerial technique called Light Detection and Ranging, or LiDAR. But the drone-mounted tools and analysis techniques deployed by Zahawi’s team could offer an alternative that is more efficient than manual assessments but nimbler and less expensive than LiDAR.
To conduct manual assessments of a site’s aboveground carbon stocks, researchers measure the diameter and height of trees and calculate the wood’s specific gravity. It’s slow, tedious, and — depending on the terrain and location — arduous.
Remote sensing methods such as LiDAR, on the other hand, are faster and more efficient but considerably more expensive. LiDAR uses lasers affixed to airplanes to create 3D maps of forests. The instrument measures a canopy’s height by bouncing laser pulses off treetops and off the forest floor, then converting the extra time the light takes to reach the ground into a distance. The technique can cover enormous areas (tens to thousands of square kilometers in a single flight), but it can cost upward of $20,000 per flight.
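The timing arithmetic behind that height measurement is simple. A minimal sketch, using made-up pulse times rather than figures from any real flight:

```python
# Sketch of the LiDAR time-of-flight arithmetic described above.
# The return times below are illustrative, not from the study.
C = 299_792_458.0  # speed of light, m/s

def range_from_return(t_round_trip_s):
    """One-way distance implied by a round-trip pulse time."""
    return C * t_round_trip_s / 2.0

# Suppose the treetop return arrives after 3.2 microseconds and the
# ground return (through a gap in the canopy) after 3.4 microseconds.
top_range = range_from_return(3.2e-6)     # ~479.7 m to the treetop
ground_range = range_from_return(3.4e-6)  # ~509.6 m to the forest floor
canopy_height = ground_range - top_range  # ~30 m of canopy
```

The 0.2-microsecond gap between the two returns is what encodes the canopy height; everything else cancels out.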
In contrast, the six-rotor “hexacopter” drone and 10-megapixel camera system Zahawi’s team used cost about $1,500, plus operator time, and can be used repeatedly. Of course, crashes that damage or destroy drones drive that cost up.
Deciding which technology to deploy among satellite, LiDAR, drones and terrestrial surveys means weighing the scale, frequency and cost of the project. As a middle ground between manual surveys and LiDAR, drones are ideal for assessing individual plots on a regular basis, especially in areas with rugged or dangerous terrain.
In areas where cloud cover makes gathering data from satellite images more challenging, drone footage could also supplement satellite data. When piloted by an experienced user, the drone system can survey and capture images from a relatively large area. For example, Dandois and his team surveyed seven one-hectare plots on a single day, including the time required to drive between them.
The methodology is useful for other metrics, as well: Dandois spent the last year as a postdoc at the Smithsonian Tropical Research Institute in neighboring Panama, where he conducted weekly flights over a 50-hectare plot on Barro Colorado Island. By comparing the resulting 3D canopy surface models over time, Dandois was able to quickly quantify seasonal changes (phenology) in the forest.
The BCI plot took about 30 minutes to document, excluding travel time. Manually gathering information from each of the plot’s 2,000 to 3,000 trees about flowering, fruiting, and leaf presence would take a researcher a week or more, Dandois said.
Not so simple — structure from motion and the digital terrain model
But evaluating a restoration project or assessing seasonal changes in a forest is not as simple as flying drones willy-nilly over a patch of trees. In areas with rough ground or dense forest canopy, researchers need to know the region’s underlying topography so that they can compare it to the drone’s canopy cover data. That baseline information is called a digital terrain model, or DTM, and must be collected manually using GPS or borrowed from a previous LiDAR flight over the area. But researchers only need to gather the information once to create a DTM. “Once you have that, it never changes,” Zahawi said, “unless you send a bulldozer in.”
After the drone’s camera snaps thousands of photos of an area, Zahawi’s team uses an open-source set of programs and algorithms called Ecosynth to create a 3D model, or ‘point cloud,’ of the forest. To stitch together the overlapping images, the researchers “take the photos, throw them at the computer, and the computer figures out how the photos should be arranged,” Dandois said. The software uses computer vision to identify features and match up overlapping images. Then the Ecosynth program uses “structure from motion” algorithms to deduce the configuration of the canopy. Structure from motion algorithms are kind of like the binocular vision used by humans, but instead of comparing images from both eyes at once, the software compares multiple images from a single camera as it moves around.
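That binocular analogy can be boiled down to a toy calculation. The sketch below recovers the depth of a single matched feature from two overlapping photos taken as the camera moves sideways; the focal length, camera baseline, and point position are invented for illustration, and real structure-from-motion software (such as the algorithms Ecosynth wraps) solves this jointly for thousands of matched features and unknown camera poses:

```python
# Toy "two views of one point" depth recovery -- the simplest case of
# structure from motion. All numbers here are illustrative assumptions.
f = 1200.0   # focal length, in pixels (assumed)
b = 0.5      # distance the camera moved between the two shots, meters

def depth_from_disparity(u_first, u_second):
    """Depth of a feature matched across two images taken b meters apart."""
    disparity = u_first - u_second  # horizontal pixel shift of the feature
    return f * b / disparity

# A treetop 30 m away and 4 m to the side projects to these pixel columns
# in the two images (simple pinhole camera model):
Z_true, X = 30.0, 4.0
u1 = f * X / Z_true          # column in the first image
u2 = f * (X - b) / Z_true    # column in the second image
print(depth_from_disparity(u1, u2))  # prints 30.0
```

The farther the feature, the smaller the pixel shift between images, which is why the software needs many well-overlapped photos to resolve a tall canopy accurately.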
Dandois helped develop the methodology as a PhD student at the University of Maryland, Baltimore County, where he got started by attaching a camera to a kite and flying it around campus.
“One hundred years of ecology have been on the ground,” Dandois said. “Now we have the ability to pop right up [above the canopy] and see things on more or less a similar scale as a person walking around.”
An eye in the sky
Getting a bird’s-eye view of a forest can be advantageous for researchers. “Some of the data that we collected with this methodology we can’t collect as well on the ground,” Zahawi said. In some cases, important information just isn’t visible from the ground, and drones offer a “top-down versus a bottom-up perspective,” Zahawi said.
By placing a 3D model over an area’s baseline topography — the DTM — researchers can quickly analyze a forest for different attributes, such as canopy roughness, or predict the presence of birds. “Some may be ecologically meaningless,” Zahawi said, but “you may hit on something that may be really important. So playing around with these technologies and the information that it provides for you is a really promising tool.”
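As a rough illustration of that overlay step, the sketch below subtracts a hypothetical DTM from a gridded canopy surface derived from a drone point cloud, then computes two simple attributes. The elevation values and the choice of standard deviation as a “roughness” metric are illustrative assumptions, not the team’s published method:

```python
import numpy as np

# Hypothetical 4x4 grids (one cell per patch of plot): a canopy surface
# from the drone point cloud, and the DTM (bare-ground elevation)
# collected once by GPS or borrowed from a prior LiDAR flight.
dsm = np.array([[112., 118., 120., 115.],
                [110., 121., 124., 117.],
                [108., 116., 122., 119.],
                [107., 111., 114., 113.]])
dtm = np.array([[100., 101., 102., 102.],
                [ 99., 100., 101., 101.],
                [ 98.,  99., 100., 100.],
                [ 97.,  98.,  99.,  99.]])

canopy_height = dsm - dtm           # vegetation height above the ground
mean_height = canopy_height.mean()  # e.g. an input to biomass estimates
roughness = canopy_height.std()     # one simple "canopy roughness" metric
```

Once the subtraction is done, any number of candidate attributes can be computed from the same height grid, which is the “playing around” Zahawi describes.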
Fruit-eating birds, or frugivores, play a key role in forest restoration. By dispersing plants’ seeds in a phenomenon evocatively known as “seed rain,” frugivores can help plants reestablish themselves in areas from which they’ve been eradicated. But those birds are more likely to visit areas with taller and more complex vegetation.
By interpreting the 3D forest models generated by Ecosynth, Zahawi’s team could predict the presence of frugivores about as well as researchers conducting ground surveys.
The technique has its limitations, of course. To traverse the expansive areas covered by a plane with LiDAR sensors, a drone would have to be flown many times, from widely dispersed launch sites. Also, in areas with few trees where little restoration has taken place, it’s difficult for Ecosynth to distinguish between low-lying vegetation and the ground—a feat that also poses a challenge for LiDAR.
Up, up and away
Although the technology and software for generating and analyzing aerial imagery have come a long way since Dandois first attached a camera to a kite and flew it over UMBC’s campus, there’s still room for improvement. And as the technology and software develop, researchers believe the tools will become more accessible and inexpensive. “Drones are fairly user-friendly, but they’re not that user-friendly,” Zahawi said. “Once [the package] becomes more streamlined and automated, I think more researchers will use it.”
Drone technology will “not replace what we do, but augment and streamline some of the data we collect,” Zahawi said.
“Usually you’re just slugging around in the understory trying to measure a few variables, and it takes a long time to collect a bit of data,” he said. “So I think it’s very exciting.”
But even for experienced and cautious pilots, things will go wrong — especially when deploying cutting-edge technology in uncertain field conditions.
“The moral of the story is, ‘You’re probably going to crash,’” Dandois said.
Dandois, J. P., & Ellis, E. C. (2013). High spatial resolution three-dimensional mapping of vegetation spectral dynamics using computer vision. Remote Sensing of Environment, 135, 259-276.
Zahawi, R. A., Dandois, J. P., Holl, K. D., Nadwodny, D., Reid, J. L., & Ellis, E. C. (2015). Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biological Conservation, 186, 287-295.