Combining drone imagery and deep learning to automate species-level mapping of kelp forests

Ms Luba Reshitnyk1, Taylor Denouden1, Keith R. Holmes1

1Hakai Institute, Victoria, Canada


The advancement of remotely piloted aerial systems (RPAS, or drones) has facilitated the collection of high-resolution imagery for mapping kelp forests (giant kelp, Macrocystis pyrifera, and bull kelp, Nereocystis luetkeana). However, there is a large latency between conducting an RPAS survey and deriving the spatial extent of kelp from the imagery using current remote sensing or manual delineation methods. Recently, the application of deep learning algorithms in remote sensing has opened novel avenues for detecting patterns and objects in high-resolution imagery, offering more efficient and accurate methods to study the spatio-temporal dynamics of kelp. Here, we use a dataset of paired RPAS imagery and species-level kelp classifications (n=26) collected in British Columbia, Canada, and California, U.S.A., to train a deep learning model (LR-ASPP MobileNetV3) to automate the detection of floating canopy-forming kelp in RPAS imagery. We compare the model's performance on two tasks: (1) classifying kelp presence/absence and (2) classifying kelp to species. The results demonstrate high accuracy and efficiency in kelp classification from high-resolution imagery, both on validation datasets and in novel geographies under a variety of environmental and site conditions. This open-source tool for mapping species-level distribution of kelp forests provides an opportunity to improve the efficiency and effectiveness of RPAS-based monitoring of kelp forests across the Northeast Pacific and beyond.

Presentation Slides – Luba Reshitnyk