Publication
Towards Efficient Dialogue Processing in the Emergency Response Domain
Tatiana Anikina
In: Vishakh Padmakumar; Gisela Vallejo; Yao Fu (Eds.). Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics: Student Research Workshop (ACL-IJCNLP-SRW-2023), located at the 61st Annual Meeting of the Association for Computational Linguistics, July 10-12, Toronto, ON, Canada, Association for Computational Linguistics, 7/2023.
Abstract
In this paper, we describe the task of adapting NLP models to dialogue processing in the emergency response domain. Our goal is to provide a recipe for building a system that performs dialogue act classification and domain-specific slot tagging while being efficient, flexible, and robust. We show that adapter models (Pfeiffer et al., 2020) perform well in the emergency response domain and benefit from additional dialogue context and speaker information. Comparing adapters to standard fine-tuned Transformer models, we show that they achieve competitive results and can easily accommodate new tasks without a significant memory increase, since the base model can be shared between adapters specializing in different tasks. We also address the problem of scarce annotations in the emergency response domain and evaluate different data augmentation techniques in a low-resource setting.
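The abstract's point about sharing one base model between task-specific adapters can be illustrated with a minimal sketch. This is not the paper's actual code; it assumes the AdapterHub `adapters` library (which implements Pfeiffer et al., 2020), and the base checkpoint, task names, and label counts are placeholders chosen for illustration.

```python
# Minimal sketch: two task adapters (dialogue act classification and slot
# tagging) attached to one shared Transformer backbone, so adding a task
# adds only small adapter/head parameters rather than a second full model.
from adapters import AutoAdapterModel

# Hypothetical base checkpoint; the paper may use a different encoder.
model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# One adapter + prediction head per task; the backbone weights are shared.
model.add_adapter("dialogue_act")
model.add_classification_head("dialogue_act", num_labels=10)  # placeholder label count

model.add_adapter("slot_tagging")
model.add_tagging_head("slot_tagging", num_labels=20)  # placeholder label count

# train_adapter freezes the backbone, so training a task only updates its
# lightweight adapter and head parameters.
model.train_adapter("dialogue_act")

# At inference time, switching tasks just means activating a different
# adapter; the backbone stays in memory once.
model.set_active_adapters("slot_tagging")
```

Under these assumptions, the memory cost of supporting an additional task is only the new adapter and head, which is what makes the setup attractive for a multi-task emergency response pipeline.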