Publication
Active-Learning-Driven Deep Interactive Segmentation for Cost-Effective Labeling of Crop-Weed Image Data
Freddy Sikouonmeu; Martin Atzmueller
In: Konferenzband der 43. GIL-Jahrestagung (GIL-2023): Resiliente Agri-Food-Systeme: Herausforderungen und Lösungsansätze, February 13-14, 2023, Osnabrück, Germany, GI, 2023.
Abstract
Active learning has proven reliable for reducing the labeling cost of large datasets in (semi-)supervised machine learning tasks across various domains. In the agricultural field, however, despite past attempts to reduce the labeling cost and the burden on the labeler, the effort required to acquire pixel-level labels for semantic image segmentation tasks remains high. Typically, the respective pixel-level masks are acquired manually by drawing polygons over irregular and complex-shaped object boundaries. In contrast, this paper proposes a method that leverages a click-based deep interactive segmentation model (DISEG) in an active learning approach to harvest high-quality image segmentation labels at low cost for training a real-time task model, requiring only clicks on the objects' foreground and background surfaces. Our first experimental results indicate that, with an average of three clicks per image object and using only 3% of the unlabeled dataset, we can acquire good-quality pixel-level labels at low cost.
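
To illustrate the kind of workflow the abstract describes, the sketch below shows a minimal active-learning labeling loop in which a click-based interactive segmentation model turns a few simulated clicks per object into pixel-level masks that then train a downstream task model. This is not the authors' DISEG implementation: the classes `ToyTaskModel` and `ToyInteractiveSegmenter`, the `uncertainty` acquisition score, and the `simulate_clicks` helper are all hypothetical stand-ins introduced only to make the loop self-contained and runnable.

```python
# Illustrative sketch of an active-learning-driven interactive labeling loop.
# All models and helpers below are toy placeholders, not the paper's method.

import numpy as np

rng = np.random.default_rng(0)


class ToyTaskModel:
    """Stand-in for the real-time task model (e.g. a crop-weed segmentation network)."""

    def predict_proba(self, image):
        # Toy "foreground probability" map derived from pixel brightness.
        return np.clip(image.mean(axis=-1), 0.0, 1.0)

    def fit(self, images, masks):
        pass  # Actual training is omitted in this sketch.


class ToyInteractiveSegmenter:
    """Stand-in for the click-based deep interactive segmentation model (DISEG)."""

    def segment(self, image, fg_clicks, bg_clicks):
        # A real model refines a mask from positive/negative clicks; here we simply
        # mark pixels whose intensity is close to the clicked foreground pixels.
        mask = np.zeros(image.shape[:2], dtype=bool)
        gray = image.mean(axis=-1)
        for (y, x) in fg_clicks:
            mask |= np.abs(gray - gray[y, x]) < 0.1
        return mask


def uncertainty(task_model, image):
    """Placeholder acquisition score: mean binary entropy of the prediction map."""
    p = task_model.predict_proba(image)
    eps = 1e-8
    entropy = -(p * np.log(p + eps) + (1 - p) * np.log(1 - p + eps))
    return float(entropy.mean())


def simulate_clicks(image, n_clicks=3):
    """Simulate roughly three foreground and three background clicks per object."""
    h, w = image.shape[:2]
    fg = [(rng.integers(h), rng.integers(w)) for _ in range(n_clicks)]
    bg = [(rng.integers(h), rng.integers(w)) for _ in range(n_clicks)]
    return fg, bg


# Active-learning loop over a small synthetic unlabeled pool, labeling ~3% of it.
unlabeled = [rng.random((64, 64, 3)) for _ in range(100)]
budget = int(0.03 * len(unlabeled))

task_model = ToyTaskModel()
segmenter = ToyInteractiveSegmenter()
labeled_images, labeled_masks = [], []

for _ in range(budget):
    # 1) Query: pick the most uncertain image under the current task model.
    idx = max(range(len(unlabeled)), key=lambda i: uncertainty(task_model, unlabeled[i]))
    image = unlabeled.pop(idx)

    # 2) Annotate: a few clicks, refined into a pixel-level mask by the segmenter.
    fg_clicks, bg_clicks = simulate_clicks(image, n_clicks=3)
    mask = segmenter.segment(image, fg_clicks, bg_clicks)

    # 3) Grow the labeled set and retrain the task model on it.
    labeled_images.append(image)
    labeled_masks.append(mask)
    task_model.fit(labeled_images, labeled_masks)

print(f"Labeled {len(labeled_images)} images with ~3 clicks per object each.")
```

The point of the sketch is the division of labor: the acquisition step chooses which few images are worth annotating, while the interactive segmenter converts cheap click input into dense masks, so the expensive polygon drawing described in the abstract is avoided entirely.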