Perception for Precision Agriculture Robotics using Deep Learning

Research output: Book/Report › Ph.D. thesis


Abstract

The world population is expected to increase drastically in the next 30 years. Accordingly, we need to increase food production while treating the limited available land sustainably and keeping the environmental footprint low to decelerate global warming. Organic farming on a large scale is challenging because of its high demand for costly, physically demanding manual labor. Digitalization can pave the way towards more sustainable agriculture, and precision agricultural robots constitute one solution for enabling large-scale organic farming with high yield and a substantially lower environmental impact. These robots are equipped with sensors and actuators to perceive the fields and perform selective plant treatments. The shortage of workers for physically demanding field labor can be mitigated by robots, which can be scaled up by operating multiple units both day and night. Therefore, agricultural robots for crop fields have gained great attention in recent years.

In crop fields, weed control is performed on plants at early growth stages, which appear predominantly isolated against a soil background. In contrast, grassland fields in dairy farming are less researched and introduce further challenges. There, weed plants appear on a cluttered background consisting of other vegetation such as grass or clover. Weed identification becomes more challenging because of frequent occlusions and because foreground and background vegetation share similar pixel colors and chlorophyll content.

In this work, we develop the perception system for a field robot operating in grassland fields. We target the most problematic grassland weed, Rumex obtusifolius L., which spreads heavily when left untreated, reduces field yield, and can be toxic to livestock when consumed in large amounts. The field robot applies experimental herbicide-free weeding using laser and electrocution techniques. We exclusively use deep learning approaches, namely convolutional neural networks, to tackle the visual identification and analysis of Rumex weeds.

Robust deep learning models require large amounts of data, while ground-truth annotations are expensive to obtain. We investigate approaches to reduce this cost and improve label efficiency. We generate plant-specific image compositions and use the synthetic images during supervised model training. Moreover, we obtain domain-specific representations from unlabeled data using self-supervised contrastive learning. Finally, we make two grassland datasets publicly available for the community: (1) RumexWeeds, containing broad-view images with coarse weed annotations, and (2) RumexLeaves, containing close-up images of the weeds with fine-grained annotations of plant traits.

Based on these datasets, we develop different perception models that enable our modular field robot to remove Rumex weeds efficiently. First, the weed and its approximate joint stem are detected in a broad-view image. The robot navigates toward the detected weed to position the weeding unit above the plant. A close-view image of the weed is then taken, allowing for the extraction of more detailed plant traits, such as leaf, stem, and vein instances, which enables a more efficient weeding procedure as well as automated, detailed phenotyping.

This is a cumulative thesis based on six papers. The first five papers are peer-reviewed and published in different robotics and agriculture venues and journals. The approach for generating plant-specific image compositions was a finalist for the best paper award on agri-robotics at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). The last paper is an integration paper presenting the final outcome of the joint work achieved within the underlying European project GALIRUMI; it is still under preparation and will be submitted soon.
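
The abstract mentions generating plant-specific image compositions to improve label efficiency. The snippet below is only a minimal, illustrative sketch of such copy-paste-style composition, assuming pre-segmented plant cutouts (RGB crop plus binary mask) and grassland background images; all function names and parameter values are hypothetical and do not reflect the thesis' actual pipeline.

```python
# Minimal sketch: paste segmented plant cutouts onto grassland backgrounds to
# create synthetic training images. Hypothetical interface, not the thesis code.
import random
from PIL import Image


def compose_synthetic_image(background: Image.Image,
                            cutouts: list[tuple[Image.Image, Image.Image]],
                            n_plants: int = 3,
                            seed: int | None = None):
    """Return a composite image and the bounding boxes of the pasted plants."""
    rng = random.Random(seed)
    canvas = background.convert("RGB").copy()
    boxes = []
    for _ in range(n_plants):
        rgb, mask = rng.choice(cutouts)
        # Random scale and rotation to diversify plant appearance.
        scale = rng.uniform(0.6, 1.4)
        angle = rng.uniform(0.0, 360.0)
        w = max(1, int(rgb.width * scale))
        h = max(1, int(rgb.height * scale))
        rgb = rgb.resize((w, h)).rotate(angle, expand=True)
        mask = mask.resize((w, h)).rotate(angle, expand=True)
        # Random placement that keeps the cutout inside the canvas.
        x = rng.randint(0, max(0, canvas.width - rgb.width))
        y = rng.randint(0, max(0, canvas.height - rgb.height))
        canvas.paste(rgb, (x, y), mask.convert("L"))
        boxes.append((x, y, x + rgb.width, y + rgb.height))
    return canvas, boxes
```

Because the paste locations and masks are known, each composite comes with free ground-truth labels that can be used during supervised detector training.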
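For the self-supervised contrastive learning mentioned in the abstract, a widely used objective is the NT-Xent (SimCLR-style) loss over two augmented views of each unlabeled image. The PyTorch sketch below shows this generic formulation only; it is not claimed to be the exact objective or the hyperparameters used in the thesis.

```python
# Generic NT-Xent contrastive loss sketch for two augmented views per image.
import torch
import torch.nn.functional as F


def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor,
                 temperature: float = 0.1) -> torch.Tensor:
    """z1, z2: (N, D) embeddings of two augmented views of the same N images."""
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D), unit norm
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # exclude self-similarity
    # The positive for view i is the other augmented view of the same image.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)


# Toy usage: embeddings would normally come from a CNN backbone + projection head.
loss = nt_xent_loss(torch.randn(8, 128), torch.randn(8, 128))
```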
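Finally, the two-stage perception flow described in the abstract (broad-view weed and joint-stem detection, approach, then close-view trait extraction before treatment) can be summarized schematically as below. Every class, method, and interface in this sketch is a hypothetical placeholder used purely for illustration.

```python
# Schematic two-stage weeding cycle; all interfaces are placeholders.
from dataclasses import dataclass


@dataclass
class WeedDetection:
    box: tuple[float, float, float, float]  # weed bounding box in the broad view
    stem_xy: tuple[float, float]            # approximate joint-stem position


def weeding_cycle(robot, broad_view_detector, trait_extractor) -> None:
    """One schematic pass: detect weeds coarsely, approach, analyze, treat."""
    frame = robot.capture_broad_view()
    for det in broad_view_detector(frame):   # stage 1: weed + joint stem
        robot.navigate_to(det.stem_xy)       # position weeding unit over plant
        close_up = robot.capture_close_view()
        traits = trait_extractor(close_up)   # stage 2: leaf, stem, vein instances
        robot.treat_weed(traits)             # laser or electrocution treatment
```
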
Original language: English
Publisher: Technical University of Denmark
Number of pages: 226
Publication status: Published - 2024
